Apr 24 23:51:15.884107 ip-10-0-128-234 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 23:51:15.884121 ip-10-0-128-234 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 23:51:15.884132 ip-10-0-128-234 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 23:51:15.884533 ip-10-0-128-234 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 23:51:25.948229 ip-10-0-128-234 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 23:51:25.948249 ip-10-0-128-234 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 1402b254fb2848f895285fc9cf31640f --
Apr 24 23:53:35.724094 ip-10-0-128-234 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:53:36.167262 ip-10-0-128-234 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:36.167262 ip-10-0-128-234 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:53:36.167262 ip-10-0-128-234 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:36.167262 ip-10-0-128-234 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 23:53:36.167262 ip-10-0-128-234 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:36.170077 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.169912 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:53:36.172438 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172415 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:36.172438 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172435 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:36.172438 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172440 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:36.172438 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172444 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172448 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172453 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172459 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172463 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172467 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172474 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172481 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172485 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172489 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172493 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172498 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172502 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172506 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172510 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172514 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172519 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172525 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172531 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:36.172667 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172537 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172542 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172546 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172551 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172555 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172560 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172564 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172568 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172573 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172577 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172581 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172586 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172590 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172594 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172598 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172603 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172608 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172612 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172616 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172621 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:36.173442 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172625 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172629 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172633 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172637 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172642 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172647 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172651 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172655 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172670 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172673 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172677 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172682 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172687 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172691 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172695 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172699 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172703 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172707 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172712 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172716 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:36.174327 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172720 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172724 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172728 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172733 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172738 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172742 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172745 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172751 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172755 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172762 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172766 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172770 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172790 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172795 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172799 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172803 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172808 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172813 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172818 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172822 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:36.174908 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172827 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172831 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172835 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.172840 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173764 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173812 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173817 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173822 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173833 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173839 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173847 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173855 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173860 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173865 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173870 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173875 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173880 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173885 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173890 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173896 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:36.175402 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173906 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173911 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173915 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173920 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173925 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173930 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173934 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173939 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173943 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173948 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173954 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173958 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173963 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173972 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173977 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173983 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173987 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173992 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.173997 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174001 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:36.175951 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174006 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174011 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174015 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174019 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174024 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174029 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174042 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174046 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174051 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174055 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174060 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174066 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174070 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174074 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174079 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174085 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174091 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174095 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174105 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:36.176441 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174111 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174115 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174120 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174124 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174129 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174133 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174138 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174142 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174147 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174153 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174157 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174167 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174171 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174176 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174181 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174185 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174190 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174197 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174201 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174206 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:36.176929 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174210 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174215 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174220 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174230 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174236 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174240 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174246 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174250 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174256 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174261 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.174266 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175794 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175933 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175945 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175950 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175956 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175960 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175965 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175971 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175974 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175977 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 23:53:36.177430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175980 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175984 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175988 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175991 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175994 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.175997 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176000 2575 flags.go:64] FLAG: --cloud-config=""
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176003 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176006 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176012 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176015 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176019 2575 flags.go:64] FLAG: --config-dir=""
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176023 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176027 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176032 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176035 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176039 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176043 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176047 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176050 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176053 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176056 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176059 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176065 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176068 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 23:53:36.177970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176071 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176074 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176078 2575 flags.go:64] FLAG: --enable-server="true"
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176081 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176086 2575 flags.go:64] FLAG: --event-burst="100"
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176089 2575 flags.go:64] FLAG: --event-qps="50"
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176093 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176096 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176099 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176103 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176106 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176109 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24
23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176112 2575 flags.go:64] FLAG: --eviction-soft="" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176115 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176118 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176121 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176124 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176127 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176130 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176133 2575 flags.go:64] FLAG: --feature-gates="" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176137 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176140 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176145 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176148 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176152 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 24 23:53:36.178580 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176155 2575 flags.go:64] FLAG: --help="false" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176158 2575 flags.go:64] FLAG: 
--hostname-override="ip-10-0-128-234.ec2.internal" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176161 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176164 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176167 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176171 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176176 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176179 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176182 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176184 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176188 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176191 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176194 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176197 2575 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176200 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:53:36.179212 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:53:36.176203 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176206 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176209 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176212 2575 flags.go:64] FLAG: --lock-file="" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176215 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176217 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176221 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176226 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 23:53:36.179212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176229 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176232 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176235 2575 flags.go:64] FLAG: --logging-format="text" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176239 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176243 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176246 2575 flags.go:64] FLAG: --manifest-url="" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176250 2575 flags.go:64] FLAG: 
--manifest-url-header="" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176256 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176259 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176263 2575 flags.go:64] FLAG: --max-pods="110" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176266 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176270 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176273 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176276 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176279 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176282 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176285 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176301 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176304 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176308 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176311 2575 flags.go:64] FLAG: --pod-cidr="" Apr 24 23:53:36.179788 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176314 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176320 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176323 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 23:53:36.179788 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176326 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176329 2575 flags.go:64] FLAG: --port="10250" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176333 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176335 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-070ebef341449c11c" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176339 2575 flags.go:64] FLAG: --qos-reserved="" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176342 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176345 2575 flags.go:64] FLAG: --register-node="true" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176348 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176351 2575 flags.go:64] FLAG: --register-with-taints="" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176355 2575 flags.go:64] FLAG: --registry-burst="10" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176358 2575 flags.go:64] FLAG: --registry-qps="5" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: 
I0424 23:53:36.176361 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176364 2575 flags.go:64] FLAG: --reserved-memory="" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176368 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176372 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176375 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176378 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176388 2575 flags.go:64] FLAG: --runonce="false" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176391 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176394 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176397 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176400 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176403 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176407 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176410 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176413 2575 flags.go:64] FLAG: --storage-driver-password="root" 
Apr 24 23:53:36.180368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176416 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176419 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176422 2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176426 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176429 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176432 2575 flags.go:64] FLAG: --system-cgroups=""
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176435 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176441 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176444 2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176447 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176454 2575 flags.go:64] FLAG: --tls-min-version=""
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176457 2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176460 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176463 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176466 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176469 2575 flags.go:64] FLAG: --v="2"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176474 2575 flags.go:64] FLAG: --version="false"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176478 2575 flags.go:64] FLAG: --vmodule=""
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176483 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.176486 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176588 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176593 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176596 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176599 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:36.181087 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176608 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176611 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176614 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176617 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176620 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176623 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176626 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176629 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176632 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176635 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176637 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176640 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176643 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176646 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176649 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176652 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176655 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176658 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176661 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176664 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:36.181671 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176667 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176670 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176672 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176675 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176678 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176680 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176683 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176685 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176688 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176691 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176694 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176696 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176699 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176707 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176710 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176712 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176717 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
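[Editor's note] The `flags.go:64] FLAG:` entries above are a mechanical dump of the kubelet's effective command-line flags, one per log record. A minimal sketch of extracting them into a name-to-value map, assuming the klog prefix format stays stable; `parse_flags` and the inlined sample lines are illustrative only, not part of the kubelet:

```python
import re

# Two sample FLAG records copied from the journal above (klog prefix assumed stable).
sample = '''\
I0424 23:53:36.176019 2575 flags.go:64] FLAG: --config-dir=""
I0424 23:53:36.176035 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
'''

# Capture the flag name and its (possibly empty) quoted value.
FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="?(.*?)"?$')

def parse_flags(text: str) -> dict[str, str]:
    """Map each --flag reported by the kubelet to its effective value."""
    flags = {}
    for line in text.splitlines():
        m = FLAG_RE.search(line)
        if m:
            flags[m.group(1)] = m.group(2)
    return flags

flags = parse_flags(sample)
```

A dump like this is the quickest way to confirm which settings still come from deprecated flags (e.g. `--container-runtime-endpoint`) rather than from the file given by `--config`.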
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176721 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176723 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176726 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:36.182208 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176728 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176731 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176734 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176739 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176742 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176744 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176747 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176750 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176752 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176756 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176758 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176761 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176764 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176766 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176769 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176771 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176774 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176796 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176799 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176801 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:36.182698 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176804 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176807 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176810 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176812 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176815 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176818 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176827 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176830 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176832 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176835 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176837 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176840 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176842 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176845 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176847 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176851 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176854 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176856 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176859 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:36.183485 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176861 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:36.184322 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176865 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:36.184322 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.176869 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:36.184322 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.177617 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:36.184956 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.184934 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 23:53:36.184956 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.184956 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185029 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185037 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185043 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185050 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185055 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185059 2575 feature_gate.go:328] unrecognized feature 
gate: MetricsCollectionProfiles Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185063 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185068 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185073 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185077 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:36.185079 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185081 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185086 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185090 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185105 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185110 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185114 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185118 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185122 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 
23:53:36.185127 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185131 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185135 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185139 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185143 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185147 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185151 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185155 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185160 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185164 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185168 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185172 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:36.185601 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185176 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:36.186358 
ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185181 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185185 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185188 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185192 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185199 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185206 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185211 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185215 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185220 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185224 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185228 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185232 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185236 2575 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185241 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185246 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185251 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185256 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185260 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:36.186358 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185265 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185270 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185274 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185278 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185282 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185287 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185291 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:36.187127 ip-10-0-128-234 
kubenswrapper[2575]: W0424 23:53:36.185295 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185299 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185306 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185311 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185316 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185320 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185325 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185329 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185333 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185338 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185342 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185347 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185351 2575 feature_gate.go:328] 
unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:36.187127 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185355 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185360 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185364 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185368 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185372 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185376 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185381 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185385 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185390 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185394 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185399 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185404 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 
23:53:36.185408 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185412 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185416 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185421 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:36.187864 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185425 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.185432 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185595 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185603 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185607 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185611 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 
23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185616 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185620 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185624 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185629 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185634 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185638 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185643 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185648 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185652 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:36.188298 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185656 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185660 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185664 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 
23:53:36.185669 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185674 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185678 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185683 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185688 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185693 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185697 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185701 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185706 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185717 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185722 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185726 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185730 2575 feature_gate.go:328] unrecognized feature gate: 
NetworkDiagnosticsConfig Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185735 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185739 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185743 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185748 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:36.188669 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185752 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185756 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185761 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185767 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185771 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185797 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185802 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185806 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185810 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185814 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185818 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185822 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185826 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185832 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185837 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185842 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185846 2575 
feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185851 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185855 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185859 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:36.189212 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185863 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185867 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185872 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185876 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185880 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185887 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185893 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185899 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185904 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185908 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185913 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185918 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185922 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185927 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185931 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185935 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185939 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185944 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185948 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185953 2575 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:36.190048 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185957 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185961 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185965 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185970 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185974 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185979 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185983 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185987 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185991 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185995 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.185999 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:36.186003 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: W0424 
23:53:36.186008 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.186016 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:36.190653 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.186790 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 23:53:36.191165 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.190922 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 23:53:36.192079 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.192066 2575 server.go:1019] "Starting client certificate rotation"
Apr 24 23:53:36.192184 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.192164 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:53:36.192226 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.192213 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:53:36.217613 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.217578 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:53:36.223638 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.223608 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:53:36.240136 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.240109 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 24 23:53:36.245510 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.245479 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:36.246559 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.246543 2575 log.go:25] "Validated CRI v1 image API"
Apr 24 23:53:36.247833 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.247817 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 23:53:36.253594 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.253574 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a20bab32-a8cd-458f-a897-dca5fa5b1d53:/dev/nvme0n1p3 d6c00ad3-95f9-4548-b7ef-b78b225661c5:/dev/nvme0n1p4]
Apr 24 23:53:36.253646 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.253595 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 23:53:36.259655 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.259533 2575 manager.go:217] Machine: {Timestamp:2026-04-24 23:53:36.257419025 +0000 UTC m=+0.411032586 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3133148 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec204cc3434f43521e7a69ad1f004faa SystemUUID:ec204cc3-434f-4352-1e7a-69ad1f004faa BootID:1402b254-fb28-48f8-9528-5fc9cf31640f Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b6:36:b2:ff:b3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b6:36:b2:ff:b3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:86:93:e1:75:10:94 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 23:53:36.259655 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.259644 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 23:53:36.259806 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.259736 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 23:53:36.260903 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.260872 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 23:53:36.261073 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.260905 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-234.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 23:53:36.261117 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.261084 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 23:53:36.261117 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.261093 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 23:53:36.261117 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.261107 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 23:53:36.261973 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.261962 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 23:53:36.262722 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.262698 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-crjbz"
Apr 24 23:53:36.262864 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.262850 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 23:53:36.263002 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.262993 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 23:53:36.265509 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.265499 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 23:53:36.265566 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.265521 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 23:53:36.265566 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.265534 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 23:53:36.265566 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.265544 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 24 23:53:36.265566 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.265557 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 23:53:36.266699 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.266686 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 23:53:36.266789 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.266705 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 23:53:36.269614 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.269591 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-crjbz"
Apr 24 23:53:36.269827 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.269812 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 23:53:36.271229 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.271215 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 23:53:36.272620 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272609 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 23:53:36.272662 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272627 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 23:53:36.272662 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272634 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 23:53:36.272662 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272640 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 23:53:36.272662 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272648 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 23:53:36.272662 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272657 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 23:53:36.272808 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272666 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 23:53:36.272808 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272672 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 23:53:36.272808 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272679 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 23:53:36.272808 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272685 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 23:53:36.272808 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272701 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 23:53:36.272808 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.272710 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 23:53:36.273570 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.273558 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 23:53:36.273604 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.273572 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 23:53:36.277669 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.277642 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 23:53:36.277767 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.277714 2575 server.go:1295] "Started kubelet"
Apr 24 23:53:36.278281 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.278246 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 23:53:36.278400 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.278130 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 23:53:36.278400 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.278325 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 23:53:36.278887 ip-10-0-128-234 systemd[1]: Started Kubernetes Kubelet.
Apr 24 23:53:36.279609 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.279555 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 23:53:36.281388 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.281367 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 23:53:36.285137 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.285110 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:36.285231 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.285172 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-234.ec2.internal" not found
Apr 24 23:53:36.285422 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.285405 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:36.288354 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.288335 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 23:53:36.288620 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.288584 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 23:53:36.289319 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:36.289280 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-234.ec2.internal\" not found"
Apr 24 23:53:36.289438 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289425 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 23:53:36.289501 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289428 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 23:53:36.289501 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289452 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 23:53:36.289501 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289464 2575 factory.go:55] Registering systemd factory
Apr 24 23:53:36.289501 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289484 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 24 23:53:36.289671 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289536 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 23:53:36.289671 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289548 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 23:53:36.289735 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289701 2575 factory.go:153] Registering CRI-O factory
Apr 24 23:53:36.289735 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289711 2575 factory.go:223] Registration of the crio container factory successfully
Apr 24 23:53:36.289813 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289769 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 23:53:36.289813 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289811 2575 factory.go:103] Registering Raw factory
Apr 24 23:53:36.289883 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.289825 2575 manager.go:1196] Started watching for new ooms in manager
Apr 24 23:53:36.290276 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:36.290252 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 23:53:36.290466 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.290451 2575 manager.go:319] Starting recovery of all containers
Apr 24 23:53:36.290534 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.290479 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:36.293867 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:36.293842 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-234.ec2.internal\" not found" node="ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.299935 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.299802 2575 manager.go:324] Recovery completed
Apr 24 23:53:36.300030 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.299836 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-234.ec2.internal" not found
Apr 24 23:53:36.304463 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.304451 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:36.306392 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.306203 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:36.306466 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.306432 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:36.306466 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.306451 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-234.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:36.307235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.307219 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 23:53:36.307235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.307235 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 23:53:36.307375 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.307256 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 23:53:36.310372 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.310357 2575 policy_none.go:49] "None policy: Start"
Apr 24 23:53:36.310439 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.310384 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 23:53:36.310439 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.310396 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.359303 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-234.ec2.internal" not found
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.361514 2575 manager.go:341] "Starting Device Plugin manager"
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:36.361550 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.361561 2575 server.go:85] "Starting device plugin registration server"
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.361875 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.361888 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.362000 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.362086 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.362095 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:36.362632 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 23:53:36.364505 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:36.362681 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-234.ec2.internal\" not found"
Apr 24 23:53:36.408326 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.408284 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 23:53:36.409546 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.409527 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 23:53:36.409620 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.409559 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 23:53:36.409620 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.409580 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 23:53:36.409620 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.409587 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 23:53:36.409727 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:36.409627 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 23:53:36.411795 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.411744 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:36.462638 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.462541 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:36.463528 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.463510 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:36.463637 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.463546 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:36.463637 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.463561 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-234.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:36.463637 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.463592 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.471475 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.471453 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.471475 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:36.471481 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-234.ec2.internal\": node \"ip-10-0-128-234.ec2.internal\" not found"
Apr 24 23:53:36.510507 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.510463 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal"]
Apr 24 23:53:36.514704 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.514687 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.514704 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.514698 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.536628 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.536598 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.541160 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.541141 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.544431 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.544416 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:36.553947 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.553924 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:36.590871 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.590846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/aef0222d478965f43e6fdd10ed145026-config\") pod \"kube-apiserver-proxy-ip-10-0-128-234.ec2.internal\" (UID: \"aef0222d478965f43e6fdd10ed145026\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.591012 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.590879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d47070605c0a9645ad2e709bbb472a77-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal\" (UID: \"d47070605c0a9645ad2e709bbb472a77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.591012 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.590908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d47070605c0a9645ad2e709bbb472a77-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal\" (UID: \"d47070605c0a9645ad2e709bbb472a77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.691839 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.691802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/aef0222d478965f43e6fdd10ed145026-config\") pod \"kube-apiserver-proxy-ip-10-0-128-234.ec2.internal\" (UID: \"aef0222d478965f43e6fdd10ed145026\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.691839 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.691831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d47070605c0a9645ad2e709bbb472a77-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal\" (UID: \"d47070605c0a9645ad2e709bbb472a77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.692055 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.691852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d47070605c0a9645ad2e709bbb472a77-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal\" (UID: \"d47070605c0a9645ad2e709bbb472a77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.692055 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.691902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d47070605c0a9645ad2e709bbb472a77-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal\" (UID: \"d47070605c0a9645ad2e709bbb472a77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.692055 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.691902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d47070605c0a9645ad2e709bbb472a77-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal\" (UID: \"d47070605c0a9645ad2e709bbb472a77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.692055 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.691906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/aef0222d478965f43e6fdd10ed145026-config\") pod \"kube-apiserver-proxy-ip-10-0-128-234.ec2.internal\" (UID: \"aef0222d478965f43e6fdd10ed145026\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.848488 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.848385 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:36.856873 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:36.856846 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal"
Apr 24 23:53:37.192276 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.192198 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 23:53:37.192905 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.192365 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:53:37.192905 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.192396 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:53:37.266169 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.266134 2575 apiserver.go:52] "Watching apiserver"
Apr 24 23:53:37.271240 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.271205 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:48:36 +0000 UTC" deadline="2027-11-26 19:23:44.361759541 +0000 UTC"
Apr 24 23:53:37.271368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.271253 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13939h30m7.0905122s"
Apr 24 23:53:37.274092 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.274077 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 23:53:37.275507 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.275486 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xvbxz","openshift-network-diagnostics/network-check-target-clwv5","kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr","openshift-cluster-node-tuning-operator/tuned-lxw9p","openshift-multus/multus-additional-cni-plugins-bmv4v","openshift-network-operator/iptables-alerter-2w7lz","openshift-ovn-kubernetes/ovnkube-node-4822f","kube-system/konnectivity-agent-zgjcf","openshift-dns/node-resolver-rpxzt","openshift-image-registry/node-ca-7662p","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal","openshift-multus/multus-pkrwp"]
Apr 24 23:53:37.278921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.278897 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz"
Apr 24 23:53:37.279046 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:37.278995 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94"
Apr 24 23:53:37.281146 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.281123 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5"
Apr 24 23:53:37.281250 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:37.281201 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3"
Apr 24 23:53:37.285194 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.285173 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr"
Apr 24 23:53:37.285274 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.285204 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lxw9p"
Apr 24 23:53:37.287428 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.287406 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bmv4v"
Apr 24 23:53:37.287540 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.287522 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:53:37.287602 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.287557 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 23:53:37.287693 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.287677 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 23:53:37.287833 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.287818 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 23:53:37.287833 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.287829 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xvdhn\""
Apr 24 23:53:37.287939 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.287917 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 23:53:37.288642 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.288625 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7zkw6\""
Apr 24 23:53:37.289341 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.289327 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 23:53:37.289463 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.289448 2575 util.go:30] "No sandbox
for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2w7lz" Apr 24 23:53:37.289868 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.289850 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 23:53:37.289971 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.289959 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 23:53:37.290023 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.289969 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 23:53:37.290173 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.290159 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 23:53:37.290252 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.290238 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 23:53:37.290357 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.290344 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hhhm8\"" Apr 24 23:53:37.291622 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.291606 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:37.291726 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.291709 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:37.291790 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.291741 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.292008 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.291961 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n7rsm\"" Apr 24 23:53:37.292093 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.292020 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 23:53:37.293325 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293309 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-sysctl-d\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.293372 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kcd9\" (UniqueName: \"kubernetes.io/projected/f11aa6c0-4d7d-4326-84df-857c34aa6e63-kube-api-access-2kcd9\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.293372 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.293372 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-socket-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.293481 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-systemd\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.293481 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293413 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-lib-modules\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.293481 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-tuned\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.293663 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-var-lib-openvswitch\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.293663 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:53:37.293510 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-run-ovn\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.293663 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-registration-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.293663 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293569 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-host\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.293663 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:37.293916 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-etc-selinux\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: 
\"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.293916 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-kubernetes\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.293916 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293748 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/173885a1-11c8-47c2-a5b4-51ef670b7bc6-tmp\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.293916 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgjzd\" (UniqueName: \"kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd\") pod \"network-check-target-clwv5\" (UID: \"eb26be0d-42dc-4350-8240-8da8402a51a3\") " pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:37.293916 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293816 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-sys-fs\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.293916 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293839 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-modprobe-d\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.293916 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293862 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-sys\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.293916 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293896 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-var-lib-kubelet\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-systemd-units\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.293980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-run-openvswitch\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294007 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f11aa6c0-4d7d-4326-84df-857c34aa6e63-ovnkube-script-lib\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-cni-bin\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-cni-netd\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294077 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-cnibin\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htc6w\" (UniqueName: \"kubernetes.io/projected/41d8f92c-2b55-41bd-b446-69fde40a9e8e-kube-api-access-htc6w\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: 
\"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294108 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-run-ovn-kubernetes\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw9ns\" (UniqueName: \"kubernetes.io/projected/908b6dc1-8fd0-4631-8022-d81ae6d15f95-kube-api-access-zw9ns\") pod \"iptables-alerter-2w7lz\" (UID: \"908b6dc1-8fd0-4631-8022-d81ae6d15f95\") " pod="openshift-network-operator/iptables-alerter-2w7lz" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkmg2\" (UniqueName: \"kubernetes.io/projected/148a2391-987d-4318-b295-01018903ff94-kube-api-access-kkmg2\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f11aa6c0-4d7d-4326-84df-857c34aa6e63-ovnkube-config\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294237 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-run-systemd\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-etc-openvswitch\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-sysctl-conf\") pod \"tuned-lxw9p\" (UID: 
\"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-node-log\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-os-release\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294403 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqqvc\" (UniqueName: \"kubernetes.io/projected/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-kube-api-access-rqqvc\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294420 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zgjcf" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f11aa6c0-4d7d-4326-84df-857c34aa6e63-ovn-node-metrics-cert\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-cni-binary-copy\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcx7l\" (UniqueName: \"kubernetes.io/projected/173885a1-11c8-47c2-a5b4-51ef670b7bc6-kube-api-access-qcx7l\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-run-netns\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294510 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-log-socket\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294528 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-sysconfig\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294569 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294608 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f11aa6c0-4d7d-4326-84df-857c34aa6e63-env-overrides\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294636 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/908b6dc1-8fd0-4631-8022-d81ae6d15f95-iptables-alerter-script\") pod \"iptables-alerter-2w7lz\" (UID: 
\"908b6dc1-8fd0-4631-8022-d81ae6d15f95\") " pod="openshift-network-operator/iptables-alerter-2w7lz" Apr 24 23:53:37.294848 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-kubelet\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.295286 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-system-cni-dir\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.295286 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-device-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.295286 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-run\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.295286 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294738 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-slash\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.295286 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.294764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/908b6dc1-8fd0-4631-8022-d81ae6d15f95-host-slash\") pod \"iptables-alerter-2w7lz\" (UID: \"908b6dc1-8fd0-4631-8022-d81ae6d15f95\") " pod="openshift-network-operator/iptables-alerter-2w7lz" Apr 24 23:53:37.295512 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.295369 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 23:53:37.295512 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.295388 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-94qv5\"" Apr 24 23:53:37.295512 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.295391 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 23:53:37.295512 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.295399 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 23:53:37.295512 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.295468 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 23:53:37.295512 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.295493 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 23:53:37.295512 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.295512 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 23:53:37.296762 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.296735 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rpxzt" Apr 24 23:53:37.296866 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.296850 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dcgnw\"" Apr 24 23:53:37.296926 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.296911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 23:53:37.297002 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.296770 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 23:53:37.299243 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.298994 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 23:53:37.299243 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.299218 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cv7z5\"" Apr 24 23:53:37.299390 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.299221 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 23:53:37.300948 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.300932 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7662p" Apr 24 23:53:37.303320 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.303304 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 23:53:37.303486 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.303473 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 23:53:37.303772 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.303747 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.303855 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.303811 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8s2v9\"" Apr 24 23:53:37.304069 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.304055 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 23:53:37.304801 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.304768 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 23:53:37.306084 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.306054 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 23:53:37.306168 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.306154 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9nv4r\"" Apr 24 23:53:37.325516 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.325496 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-r9ft4" Apr 24 23:53:37.333923 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.333894 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-r9ft4" Apr 24 23:53:37.354862 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.354832 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd47070605c0a9645ad2e709bbb472a77.slice/crio-3e83c01d9f8ea05206dfc4a75e853fcbe192672f4d78615d181a48d51e4871eb WatchSource:0}: Error finding container 3e83c01d9f8ea05206dfc4a75e853fcbe192672f4d78615d181a48d51e4871eb: Status 404 returned error can't find the container with id 3e83c01d9f8ea05206dfc4a75e853fcbe192672f4d78615d181a48d51e4871eb Apr 24 23:53:37.359729 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.359706 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef0222d478965f43e6fdd10ed145026.slice/crio-f0eb407f398e5eb3c6cdd474be4644367e5190dc664066cedf3d47ad1648df7e WatchSource:0}: Error finding container f0eb407f398e5eb3c6cdd474be4644367e5190dc664066cedf3d47ad1648df7e: Status 404 returned error can't find the container with id f0eb407f398e5eb3c6cdd474be4644367e5190dc664066cedf3d47ad1648df7e Apr 24 23:53:37.359863 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.359796 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:53:37.390125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.390108 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 23:53:37.395315 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.395283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-sysconfig\") pod 
\"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.395467 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.395443 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-sysconfig\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.395577 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.395554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.395701 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.395686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f11aa6c0-4d7d-4326-84df-857c34aa6e63-env-overrides\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.395845 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.395791 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.395947 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.395856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/908b6dc1-8fd0-4631-8022-d81ae6d15f95-iptables-alerter-script\") pod \"iptables-alerter-2w7lz\" (UID: \"908b6dc1-8fd0-4631-8022-d81ae6d15f95\") " pod="openshift-network-operator/iptables-alerter-2w7lz" Apr 24 23:53:37.396010 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.395903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/55b4791c-ab54-4f79-a22b-f9adb92a1461-serviceca\") pod \"node-ca-7662p\" (UID: \"55b4791c-ab54-4f79-a22b-f9adb92a1461\") " pod="openshift-image-registry/node-ca-7662p" Apr 24 23:53:37.396010 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.395982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-kubelet\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.396103 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-kubelet\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.396201 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396177 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-system-cni-dir\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.396468 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396071 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-system-cni-dir\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.396521 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-device-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.396521 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-run\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.396624 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-slash\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.396624 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/908b6dc1-8fd0-4631-8022-d81ae6d15f95-host-slash\") pod \"iptables-alerter-2w7lz\" (UID: \"908b6dc1-8fd0-4631-8022-d81ae6d15f95\") " pod="openshift-network-operator/iptables-alerter-2w7lz" Apr 24 23:53:37.396624 ip-10-0-128-234 kubenswrapper[2575]: I0424 
23:53:37.396568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-sysctl-d\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.396624 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kcd9\" (UniqueName: \"kubernetes.io/projected/f11aa6c0-4d7d-4326-84df-857c34aa6e63-kube-api-access-2kcd9\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.396624 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-run\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/908b6dc1-8fd0-4631-8022-d81ae6d15f95-host-slash\") pod \"iptables-alerter-2w7lz\" (UID: \"908b6dc1-8fd0-4631-8022-d81ae6d15f95\") " pod="openshift-network-operator/iptables-alerter-2w7lz" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/60d87af7-f253-4e76-9605-d6707237c596-tmp-dir\") pod \"node-resolver-rpxzt\" (UID: \"60d87af7-f253-4e76-9605-d6707237c596\") " pod="openshift-dns/node-resolver-rpxzt" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 
23:53:37.396680 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-var-lib-cni-bin\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-device-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396710 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-var-lib-cni-multus\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396720 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-sysctl-d\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-etc-kubernetes\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.396921 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-slash\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/908b6dc1-8fd0-4631-8022-d81ae6d15f95-iptables-alerter-script\") pod \"iptables-alerter-2w7lz\" (UID: \"908b6dc1-8fd0-4631-8022-d81ae6d15f95\") " pod="openshift-network-operator/iptables-alerter-2w7lz" Apr 24 23:53:37.396921 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-socket-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-systemd\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-lib-modules\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-tuned\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.396989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-systemd\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-var-lib-openvswitch\") pod \"ovnkube-node-4822f\" (UID: 
\"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-socket-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397048 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-var-lib-openvswitch\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-run-ovn\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-run-ovn\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/60d87af7-f253-4e76-9605-d6707237c596-hosts-file\") pod \"node-resolver-rpxzt\" 
(UID: \"60d87af7-f253-4e76-9605-d6707237c596\") " pod="openshift-dns/node-resolver-rpxzt" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f11aa6c0-4d7d-4326-84df-857c34aa6e63-env-overrides\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-registration-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-host\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-registration-dir\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-cnibin\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc9da0e8-bb12-42fb-a6da-363511285477-cni-binary-copy\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397121 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-lib-modules\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.397383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-host\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397231 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-multus-socket-dir-parent\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397260 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-run-k8s-cni-cncf-io\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-hostroot\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqkt4\" (UniqueName: \"kubernetes.io/projected/bc9da0e8-bb12-42fb-a6da-363511285477-kube-api-access-lqkt4\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-etc-selinux\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-kubernetes\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:37.397440 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-etc-selinux\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/173885a1-11c8-47c2-a5b4-51ef670b7bc6-tmp\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-kubernetes\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:37.397528 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs podName:148a2391-987d-4318-b295-01018903ff94 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:37.897489501 +0000 UTC m=+2.051103033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs") pod "network-metrics-daemon-xvbxz" (UID: "148a2391-987d-4318-b295-01018903ff94") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc9da0e8-bb12-42fb-a6da-363511285477-multus-daemon-config\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjzd\" (UniqueName: \"kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd\") pod \"network-check-target-clwv5\" (UID: \"eb26be0d-42dc-4350-8240-8da8402a51a3\") " pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-sys-fs\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.398064 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-modprobe-d\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/41d8f92c-2b55-41bd-b446-69fde40a9e8e-sys-fs\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-modprobe-d\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-sys\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-var-lib-kubelet\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lxw9p"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-systemd-units\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-sys\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlxc8\" (UniqueName: \"kubernetes.io/projected/60d87af7-f253-4e76-9605-d6707237c596-kube-api-access-mlxc8\") pod \"node-resolver-rpxzt\" (UID: \"60d87af7-f253-4e76-9605-d6707237c596\") " pod="openshift-dns/node-resolver-rpxzt"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-var-lib-kubelet\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-run-multus-certs\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397879 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-systemd-units\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397904 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-run-openvswitch\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f11aa6c0-4d7d-4326-84df-857c34aa6e63-ovnkube-script-lib\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-cni-bin\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397954 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-run-openvswitch\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-cni-bin\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.397974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-cni-netd\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.399565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-cni-netd\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-cnibin\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m8ft\" (UniqueName: \"kubernetes.io/projected/55b4791c-ab54-4f79-a22b-f9adb92a1461-kube-api-access-4m8ft\") pod \"node-ca-7662p\" (UID: \"55b4791c-ab54-4f79-a22b-f9adb92a1461\") " pod="openshift-image-registry/node-ca-7662p"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-os-release\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-var-lib-kubelet\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htc6w\" (UniqueName: \"kubernetes.io/projected/41d8f92c-2b55-41bd-b446-69fde40a9e8e-kube-api-access-htc6w\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398141 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-cnibin\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-run-ovn-kubernetes\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw9ns\" (UniqueName: \"kubernetes.io/projected/908b6dc1-8fd0-4631-8022-d81ae6d15f95-kube-api-access-zw9ns\") pod \"iptables-alerter-2w7lz\" (UID: \"908b6dc1-8fd0-4631-8022-d81ae6d15f95\") " pod="openshift-network-operator/iptables-alerter-2w7lz"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-run-ovn-kubernetes\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a3a145f6-4ef4-43fd-985d-2692fdc60a0b-konnectivity-ca\") pod \"konnectivity-agent-zgjcf\" (UID: \"a3a145f6-4ef4-43fd-985d-2692fdc60a0b\") " pod="kube-system/konnectivity-agent-zgjcf"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkmg2\" (UniqueName: \"kubernetes.io/projected/148a2391-987d-4318-b295-01018903ff94-kube-api-access-kkmg2\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz"
Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f11aa6c0-4d7d-4326-84df-857c34aa6e63-ovnkube-config\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-run-systemd\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398338 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-etc-openvswitch\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: 
\"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.400366 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55b4791c-ab54-4f79-a22b-f9adb92a1461-host\") pod \"node-ca-7662p\" (UID: \"55b4791c-ab54-4f79-a22b-f9adb92a1461\") " pod="openshift-image-registry/node-ca-7662p" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f11aa6c0-4d7d-4326-84df-857c34aa6e63-ovnkube-script-lib\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398413 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-multus-cni-dir\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a3a145f6-4ef4-43fd-985d-2692fdc60a0b-agent-certs\") pod \"konnectivity-agent-zgjcf\" (UID: \"a3a145f6-4ef4-43fd-985d-2692fdc60a0b\") " pod="kube-system/konnectivity-agent-zgjcf" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398456 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-etc-openvswitch\") pod 
\"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-sysctl-conf\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-node-log\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-run-systemd\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398517 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-os-release\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398518 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-tuning-conf-dir\") 
pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398572 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-node-log\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-sysctl-conf\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398628 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-os-release\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqqvc\" 
(UniqueName: \"kubernetes.io/projected/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-kube-api-access-rqqvc\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-multus-conf-dir\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f11aa6c0-4d7d-4326-84df-857c34aa6e63-ovn-node-metrics-cert\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-cni-binary-copy\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-system-cni-dir\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398934 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcx7l\" (UniqueName: \"kubernetes.io/projected/173885a1-11c8-47c2-a5b4-51ef670b7bc6-kube-api-access-qcx7l\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.398984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.399032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-run-netns\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.399068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-log-socket\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.399100 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-run-netns\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.401658 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.399126 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f11aa6c0-4d7d-4326-84df-857c34aa6e63-ovnkube-config\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.399154 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-log-socket\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.399189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f11aa6c0-4d7d-4326-84df-857c34aa6e63-host-run-netns\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.399323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-cni-binary-copy\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.399349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " 
pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.401242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/173885a1-11c8-47c2-a5b4-51ef670b7bc6-etc-tuned\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.401291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/173885a1-11c8-47c2-a5b4-51ef670b7bc6-tmp\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.401658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.401322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f11aa6c0-4d7d-4326-84df-857c34aa6e63-ovn-node-metrics-cert\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.404098 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:37.404081 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:37.404152 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:37.404105 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:37.404152 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:37.404120 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgjzd for pod openshift-network-diagnostics/network-check-target-clwv5: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:37.404213 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:37.404188 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd podName:eb26be0d-42dc-4350-8240-8da8402a51a3 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:37.904168429 +0000 UTC m=+2.057781991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dgjzd" (UniqueName: "kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd") pod "network-check-target-clwv5" (UID: "eb26be0d-42dc-4350-8240-8da8402a51a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:37.404317 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.404302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kcd9\" (UniqueName: \"kubernetes.io/projected/f11aa6c0-4d7d-4326-84df-857c34aa6e63-kube-api-access-2kcd9\") pod \"ovnkube-node-4822f\" (UID: \"f11aa6c0-4d7d-4326-84df-857c34aa6e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:53:37.407205 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.407182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqqvc\" (UniqueName: \"kubernetes.io/projected/53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb-kube-api-access-rqqvc\") pod \"multus-additional-cni-plugins-bmv4v\" (UID: \"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb\") " pod="openshift-multus/multus-additional-cni-plugins-bmv4v" Apr 24 23:53:37.407348 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.407327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw9ns\" (UniqueName: 
\"kubernetes.io/projected/908b6dc1-8fd0-4631-8022-d81ae6d15f95-kube-api-access-zw9ns\") pod \"iptables-alerter-2w7lz\" (UID: \"908b6dc1-8fd0-4631-8022-d81ae6d15f95\") " pod="openshift-network-operator/iptables-alerter-2w7lz" Apr 24 23:53:37.407415 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.407397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkmg2\" (UniqueName: \"kubernetes.io/projected/148a2391-987d-4318-b295-01018903ff94-kube-api-access-kkmg2\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:37.407831 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.407817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcx7l\" (UniqueName: \"kubernetes.io/projected/173885a1-11c8-47c2-a5b4-51ef670b7bc6-kube-api-access-qcx7l\") pod \"tuned-lxw9p\" (UID: \"173885a1-11c8-47c2-a5b4-51ef670b7bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" Apr 24 23:53:37.408669 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.408649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htc6w\" (UniqueName: \"kubernetes.io/projected/41d8f92c-2b55-41bd-b446-69fde40a9e8e-kube-api-access-htc6w\") pod \"aws-ebs-csi-driver-node-jtxqr\" (UID: \"41d8f92c-2b55-41bd-b446-69fde40a9e8e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" Apr 24 23:53:37.413406 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.413368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal" event={"ID":"aef0222d478965f43e6fdd10ed145026","Type":"ContainerStarted","Data":"f0eb407f398e5eb3c6cdd474be4644367e5190dc664066cedf3d47ad1648df7e"} Apr 24 23:53:37.414408 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.414390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal" event={"ID":"d47070605c0a9645ad2e709bbb472a77","Type":"ContainerStarted","Data":"3e83c01d9f8ea05206dfc4a75e853fcbe192672f4d78615d181a48d51e4871eb"} Apr 24 23:53:37.500296 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55b4791c-ab54-4f79-a22b-f9adb92a1461-host\") pod \"node-ca-7662p\" (UID: \"55b4791c-ab54-4f79-a22b-f9adb92a1461\") " pod="openshift-image-registry/node-ca-7662p" Apr 24 23:53:37.500296 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55b4791c-ab54-4f79-a22b-f9adb92a1461-host\") pod \"node-ca-7662p\" (UID: \"55b4791c-ab54-4f79-a22b-f9adb92a1461\") " pod="openshift-image-registry/node-ca-7662p" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-multus-cni-dir\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a3a145f6-4ef4-43fd-985d-2692fdc60a0b-agent-certs\") pod \"konnectivity-agent-zgjcf\" (UID: \"a3a145f6-4ef4-43fd-985d-2692fdc60a0b\") " pod="kube-system/konnectivity-agent-zgjcf" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500366 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-multus-conf-dir\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-system-cni-dir\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-multus-conf-dir\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-run-netns\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-multus-cni-dir\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/55b4791c-ab54-4f79-a22b-f9adb92a1461-serviceca\") pod \"node-ca-7662p\" 
(UID: \"55b4791c-ab54-4f79-a22b-f9adb92a1461\") " pod="openshift-image-registry/node-ca-7662p" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500486 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-system-cni-dir\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/60d87af7-f253-4e76-9605-d6707237c596-tmp-dir\") pod \"node-resolver-rpxzt\" (UID: \"60d87af7-f253-4e76-9605-d6707237c596\") " pod="openshift-dns/node-resolver-rpxzt" Apr 24 23:53:37.500519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-run-netns\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-var-lib-cni-bin\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-var-lib-cni-multus\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " 
pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-etc-kubernetes\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/60d87af7-f253-4e76-9605-d6707237c596-hosts-file\") pod \"node-resolver-rpxzt\" (UID: \"60d87af7-f253-4e76-9605-d6707237c596\") " pod="openshift-dns/node-resolver-rpxzt" Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-var-lib-cni-bin\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-var-lib-cni-multus\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-cnibin\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp" Apr 24 23:53:37.501019 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:53:37.500711 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-etc-kubernetes\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500734 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc9da0e8-bb12-42fb-a6da-363511285477-cni-binary-copy\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500756 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/60d87af7-f253-4e76-9605-d6707237c596-hosts-file\") pod \"node-resolver-rpxzt\" (UID: \"60d87af7-f253-4e76-9605-d6707237c596\") " pod="openshift-dns/node-resolver-rpxzt"
Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-cnibin\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-multus-socket-dir-parent\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500825 2575 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-run-k8s-cni-cncf-io\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500843 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-multus-socket-dir-parent\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-hostroot\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/60d87af7-f253-4e76-9605-d6707237c596-tmp-dir\") pod \"node-resolver-rpxzt\" (UID: \"60d87af7-f253-4e76-9605-d6707237c596\") " pod="openshift-dns/node-resolver-rpxzt"
Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500884 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqkt4\" (UniqueName: \"kubernetes.io/projected/bc9da0e8-bb12-42fb-a6da-363511285477-kube-api-access-lqkt4\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-hostroot\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc9da0e8-bb12-42fb-a6da-363511285477-multus-daemon-config\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500886 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-run-k8s-cni-cncf-io\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlxc8\" (UniqueName: \"kubernetes.io/projected/60d87af7-f253-4e76-9605-d6707237c596-kube-api-access-mlxc8\") pod \"node-resolver-rpxzt\" (UID: \"60d87af7-f253-4e76-9605-d6707237c596\") " pod="openshift-dns/node-resolver-rpxzt"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/55b4791c-ab54-4f79-a22b-f9adb92a1461-serviceca\") pod \"node-ca-7662p\" (UID: \"55b4791c-ab54-4f79-a22b-f9adb92a1461\") " pod="openshift-image-registry/node-ca-7662p"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.500993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-run-multus-certs\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.501045 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-run-multus-certs\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.501059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m8ft\" (UniqueName: \"kubernetes.io/projected/55b4791c-ab54-4f79-a22b-f9adb92a1461-kube-api-access-4m8ft\") pod \"node-ca-7662p\" (UID: \"55b4791c-ab54-4f79-a22b-f9adb92a1461\") " pod="openshift-image-registry/node-ca-7662p"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.501092 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-os-release\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.501117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-var-lib-kubelet\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.501148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/a3a145f6-4ef4-43fd-985d-2692fdc60a0b-konnectivity-ca\") pod \"konnectivity-agent-zgjcf\" (UID: \"a3a145f6-4ef4-43fd-985d-2692fdc60a0b\") " pod="kube-system/konnectivity-agent-zgjcf"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.501258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-os-release\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.501306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc9da0e8-bb12-42fb-a6da-363511285477-host-var-lib-kubelet\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.501479 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc9da0e8-bb12-42fb-a6da-363511285477-cni-binary-copy\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.501542 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc9da0e8-bb12-42fb-a6da-363511285477-multus-daemon-config\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.501644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.501620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a3a145f6-4ef4-43fd-985d-2692fdc60a0b-konnectivity-ca\") pod 
\"konnectivity-agent-zgjcf\" (UID: \"a3a145f6-4ef4-43fd-985d-2692fdc60a0b\") " pod="kube-system/konnectivity-agent-zgjcf"
Apr 24 23:53:37.502851 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.502832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a3a145f6-4ef4-43fd-985d-2692fdc60a0b-agent-certs\") pod \"konnectivity-agent-zgjcf\" (UID: \"a3a145f6-4ef4-43fd-985d-2692fdc60a0b\") " pod="kube-system/konnectivity-agent-zgjcf"
Apr 24 23:53:37.509091 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.509070 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqkt4\" (UniqueName: \"kubernetes.io/projected/bc9da0e8-bb12-42fb-a6da-363511285477-kube-api-access-lqkt4\") pod \"multus-pkrwp\" (UID: \"bc9da0e8-bb12-42fb-a6da-363511285477\") " pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.509207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.509144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m8ft\" (UniqueName: \"kubernetes.io/projected/55b4791c-ab54-4f79-a22b-f9adb92a1461-kube-api-access-4m8ft\") pod \"node-ca-7662p\" (UID: \"55b4791c-ab54-4f79-a22b-f9adb92a1461\") " pod="openshift-image-registry/node-ca-7662p"
Apr 24 23:53:37.509250 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.509230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlxc8\" (UniqueName: \"kubernetes.io/projected/60d87af7-f253-4e76-9605-d6707237c596-kube-api-access-mlxc8\") pod \"node-resolver-rpxzt\" (UID: \"60d87af7-f253-4e76-9605-d6707237c596\") " pod="openshift-dns/node-resolver-rpxzt"
Apr 24 23:53:37.612376 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.612344 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr"
Apr 24 23:53:37.618592 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.618565 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41d8f92c_2b55_41bd_b446_69fde40a9e8e.slice/crio-b74cd1aa4e69ade36daac25a85a60ff583bb1c1647fe1a939959f2cd21bad46c WatchSource:0}: Error finding container b74cd1aa4e69ade36daac25a85a60ff583bb1c1647fe1a939959f2cd21bad46c: Status 404 returned error can't find the container with id b74cd1aa4e69ade36daac25a85a60ff583bb1c1647fe1a939959f2cd21bad46c
Apr 24 23:53:37.630933 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.630909 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lxw9p"
Apr 24 23:53:37.636759 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.636735 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod173885a1_11c8_47c2_a5b4_51ef670b7bc6.slice/crio-8530de237d7ba05103e2684cc0c615a604ee51118f6f7076a9b7fa902c1afa2e WatchSource:0}: Error finding container 8530de237d7ba05103e2684cc0c615a604ee51118f6f7076a9b7fa902c1afa2e: Status 404 returned error can't find the container with id 8530de237d7ba05103e2684cc0c615a604ee51118f6f7076a9b7fa902c1afa2e
Apr 24 23:53:37.662289 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.662258 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bmv4v"
Apr 24 23:53:37.668538 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.668510 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53cf6d5a_8951_44fe_a1f1_b382fb7ffbdb.slice/crio-ec5bfd3f557d9d92b9af146ab0f0e87a698d54b293a63e7862ee20f69bb2d7c2 WatchSource:0}: Error finding container ec5bfd3f557d9d92b9af146ab0f0e87a698d54b293a63e7862ee20f69bb2d7c2: Status 404 returned error can't find the container with id ec5bfd3f557d9d92b9af146ab0f0e87a698d54b293a63e7862ee20f69bb2d7c2
Apr 24 23:53:37.672140 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.672118 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2w7lz"
Apr 24 23:53:37.677773 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.677751 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod908b6dc1_8fd0_4631_8022_d81ae6d15f95.slice/crio-90c6e6389758b54e2ce3a0a8655521b6bc5dff0a15caddea1cab0ff9905a8e04 WatchSource:0}: Error finding container 90c6e6389758b54e2ce3a0a8655521b6bc5dff0a15caddea1cab0ff9905a8e04: Status 404 returned error can't find the container with id 90c6e6389758b54e2ce3a0a8655521b6bc5dff0a15caddea1cab0ff9905a8e04
Apr 24 23:53:37.678380 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.678361 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:53:37.683766 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.683737 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-zgjcf"
Apr 24 23:53:37.684955 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.684934 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf11aa6c0_4d7d_4326_84df_857c34aa6e63.slice/crio-070abec34fbbcaea17a1643e70add17755c81dbeb81bf6e3d47727502216bd2c WatchSource:0}: Error finding container 070abec34fbbcaea17a1643e70add17755c81dbeb81bf6e3d47727502216bd2c: Status 404 returned error can't find the container with id 070abec34fbbcaea17a1643e70add17755c81dbeb81bf6e3d47727502216bd2c
Apr 24 23:53:37.690095 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.690073 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rpxzt"
Apr 24 23:53:37.690435 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.690410 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3a145f6_4ef4_43fd_985d_2692fdc60a0b.slice/crio-817bcf3ba6ecd8953cfd8f166f48a716b3d81698d862451f022eebee36a1a471 WatchSource:0}: Error finding container 817bcf3ba6ecd8953cfd8f166f48a716b3d81698d862451f022eebee36a1a471: Status 404 returned error can't find the container with id 817bcf3ba6ecd8953cfd8f166f48a716b3d81698d862451f022eebee36a1a471
Apr 24 23:53:37.696198 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.696170 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60d87af7_f253_4e76_9605_d6707237c596.slice/crio-8b59d822fd60e54bfd48171ba1207ae62ca575e57b2c1139a3be07d1541556dc WatchSource:0}: Error finding container 8b59d822fd60e54bfd48171ba1207ae62ca575e57b2c1139a3be07d1541556dc: Status 404 returned error can't find the container with id 8b59d822fd60e54bfd48171ba1207ae62ca575e57b2c1139a3be07d1541556dc
Apr 24 23:53:37.697380 ip-10-0-128-234 kubenswrapper[2575]: 
I0424 23:53:37.697357 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7662p"
Apr 24 23:53:37.703611 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.703587 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b4791c_ab54_4f79_a22b_f9adb92a1461.slice/crio-0723d4c6534c2e63ceda521f0cb9d19980c73cac97bdfe321976991347b810a3 WatchSource:0}: Error finding container 0723d4c6534c2e63ceda521f0cb9d19980c73cac97bdfe321976991347b810a3: Status 404 returned error can't find the container with id 0723d4c6534c2e63ceda521f0cb9d19980c73cac97bdfe321976991347b810a3
Apr 24 23:53:37.717961 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.717933 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pkrwp"
Apr 24 23:53:37.726074 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:53:37.726043 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc9da0e8_bb12_42fb_a6da_363511285477.slice/crio-9dea8e762be6133f4529e5bbc3301f7ebd94f48a234f9cb826a9231310b07056 WatchSource:0}: Error finding container 9dea8e762be6133f4529e5bbc3301f7ebd94f48a234f9cb826a9231310b07056: Status 404 returned error can't find the container with id 9dea8e762be6133f4529e5bbc3301f7ebd94f48a234f9cb826a9231310b07056
Apr 24 23:53:37.903897 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:37.903816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz"
Apr 24 23:53:37.903897 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:37.903889 2575 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:37.904122 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:37.903960 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs podName:148a2391-987d-4318-b295-01018903ff94 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:38.903941255 +0000 UTC m=+3.057554798 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs") pod "network-metrics-daemon-xvbxz" (UID: "148a2391-987d-4318-b295-01018903ff94") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:38.004532 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.004450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjzd\" (UniqueName: \"kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd\") pod \"network-check-target-clwv5\" (UID: \"eb26be0d-42dc-4350-8240-8da8402a51a3\") " pod="openshift-network-diagnostics/network-check-target-clwv5"
Apr 24 23:53:38.004723 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:38.004700 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:38.004806 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:38.004730 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:38.004806 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:38.004744 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgjzd for pod openshift-network-diagnostics/network-check-target-clwv5: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:38.004923 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:38.004820 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd podName:eb26be0d-42dc-4350-8240-8da8402a51a3 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:39.004801796 +0000 UTC m=+3.158415332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dgjzd" (UniqueName: "kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd") pod "network-check-target-clwv5" (UID: "eb26be0d-42dc-4350-8240-8da8402a51a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:38.245092 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.244050 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:38.335304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.335248 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:37 +0000 UTC" deadline="2027-11-21 17:10:19.43453386 +0000 UTC"
Apr 24 23:53:38.335304 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.335291 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13817h16m41.099247123s"
Apr 24 23:53:38.424322 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.424209 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zgjcf" 
event={"ID":"a3a145f6-4ef4-43fd-985d-2692fdc60a0b","Type":"ContainerStarted","Data":"817bcf3ba6ecd8953cfd8f166f48a716b3d81698d862451f022eebee36a1a471"}
Apr 24 23:53:38.429887 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.429825 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" event={"ID":"173885a1-11c8-47c2-a5b4-51ef670b7bc6","Type":"ContainerStarted","Data":"8530de237d7ba05103e2684cc0c615a604ee51118f6f7076a9b7fa902c1afa2e"}
Apr 24 23:53:38.450916 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.450877 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" event={"ID":"41d8f92c-2b55-41bd-b446-69fde40a9e8e","Type":"ContainerStarted","Data":"b74cd1aa4e69ade36daac25a85a60ff583bb1c1647fe1a939959f2cd21bad46c"}
Apr 24 23:53:38.456958 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.456898 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7662p" event={"ID":"55b4791c-ab54-4f79-a22b-f9adb92a1461","Type":"ContainerStarted","Data":"0723d4c6534c2e63ceda521f0cb9d19980c73cac97bdfe321976991347b810a3"}
Apr 24 23:53:38.478013 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.477950 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rpxzt" event={"ID":"60d87af7-f253-4e76-9605-d6707237c596","Type":"ContainerStarted","Data":"8b59d822fd60e54bfd48171ba1207ae62ca575e57b2c1139a3be07d1541556dc"}
Apr 24 23:53:38.495314 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.495273 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" event={"ID":"f11aa6c0-4d7d-4326-84df-857c34aa6e63","Type":"ContainerStarted","Data":"070abec34fbbcaea17a1643e70add17755c81dbeb81bf6e3d47727502216bd2c"}
Apr 24 23:53:38.504320 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.504283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-2w7lz" event={"ID":"908b6dc1-8fd0-4631-8022-d81ae6d15f95","Type":"ContainerStarted","Data":"90c6e6389758b54e2ce3a0a8655521b6bc5dff0a15caddea1cab0ff9905a8e04"}
Apr 24 23:53:38.511965 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.511808 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bmv4v" event={"ID":"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb","Type":"ContainerStarted","Data":"ec5bfd3f557d9d92b9af146ab0f0e87a698d54b293a63e7862ee20f69bb2d7c2"}
Apr 24 23:53:38.525902 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.525434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pkrwp" event={"ID":"bc9da0e8-bb12-42fb-a6da-363511285477","Type":"ContainerStarted","Data":"9dea8e762be6133f4529e5bbc3301f7ebd94f48a234f9cb826a9231310b07056"}
Apr 24 23:53:38.718204 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.718121 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:38.911484 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:38.911443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz"
Apr 24 23:53:38.911672 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:38.911617 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:38.911733 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:38.911681 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs podName:148a2391-987d-4318-b295-01018903ff94 nodeName:}" 
failed. No retries permitted until 2026-04-24 23:53:40.911661483 +0000 UTC m=+5.065275014 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs") pod "network-metrics-daemon-xvbxz" (UID: "148a2391-987d-4318-b295-01018903ff94") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:39.012944 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:39.012853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjzd\" (UniqueName: \"kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd\") pod \"network-check-target-clwv5\" (UID: \"eb26be0d-42dc-4350-8240-8da8402a51a3\") " pod="openshift-network-diagnostics/network-check-target-clwv5"
Apr 24 23:53:39.013115 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:39.013031 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:39.013115 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:39.013049 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:39.013115 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:39.013062 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgjzd for pod openshift-network-diagnostics/network-check-target-clwv5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:39.013269 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:39.013121 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd 
podName:eb26be0d-42dc-4350-8240-8da8402a51a3 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:41.013101843 +0000 UTC m=+5.166715382 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dgjzd" (UniqueName: "kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd") pod "network-check-target-clwv5" (UID: "eb26be0d-42dc-4350-8240-8da8402a51a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:39.335978 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:39.335885 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:37 +0000 UTC" deadline="2027-10-15 23:15:06.113628355 +0000 UTC"
Apr 24 23:53:39.335978 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:39.335930 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12935h21m26.777702867s"
Apr 24 23:53:39.409955 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:39.409919 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5"
Apr 24 23:53:39.410138 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:39.410058 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3"
Apr 24 23:53:39.410198 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:39.410181 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz"
Apr 24 23:53:39.410293 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:39.410268 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94"
Apr 24 23:53:40.927164 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:40.927044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz"
Apr 24 23:53:40.927635 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:40.927248 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:40.927635 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:40.927327 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs podName:148a2391-987d-4318-b295-01018903ff94 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:44.927305985 +0000 UTC m=+9.080919533 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs") pod "network-metrics-daemon-xvbxz" (UID: "148a2391-987d-4318-b295-01018903ff94") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:41.028184 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:41.028144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjzd\" (UniqueName: \"kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd\") pod \"network-check-target-clwv5\" (UID: \"eb26be0d-42dc-4350-8240-8da8402a51a3\") " pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:41.028367 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:41.028332 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:41.028367 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:41.028358 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:41.028495 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:41.028372 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgjzd for pod openshift-network-diagnostics/network-check-target-clwv5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:41.028495 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:41.028436 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd podName:eb26be0d-42dc-4350-8240-8da8402a51a3 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:45.028416635 +0000 UTC m=+9.182030181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dgjzd" (UniqueName: "kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd") pod "network-check-target-clwv5" (UID: "eb26be0d-42dc-4350-8240-8da8402a51a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:41.410623 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:41.410121 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:41.410623 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:41.410128 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:41.410623 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:41.410257 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:53:41.410623 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:41.410616 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:53:43.410307 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:43.410268 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:43.410831 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:43.410372 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:53:43.410831 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:43.410455 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:43.410831 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:43.410504 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:53:44.963446 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:44.963397 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:44.963941 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:44.963547 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:44.963941 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:44.963625 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs podName:148a2391-987d-4318-b295-01018903ff94 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:52.963602356 +0000 UTC m=+17.117215907 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs") pod "network-metrics-daemon-xvbxz" (UID: "148a2391-987d-4318-b295-01018903ff94") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:45.064188 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:45.064147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjzd\" (UniqueName: \"kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd\") pod \"network-check-target-clwv5\" (UID: \"eb26be0d-42dc-4350-8240-8da8402a51a3\") " pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:45.064411 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:45.064382 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:45.064411 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:45.064415 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:45.064579 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:45.064431 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgjzd for pod openshift-network-diagnostics/network-check-target-clwv5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:45.064579 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:45.064503 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd podName:eb26be0d-42dc-4350-8240-8da8402a51a3 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:53.06448033 +0000 UTC m=+17.218093867 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dgjzd" (UniqueName: "kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd") pod "network-check-target-clwv5" (UID: "eb26be0d-42dc-4350-8240-8da8402a51a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:45.410656 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:45.410612 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:45.410863 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:45.410741 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:53:45.410863 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:45.410834 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:45.410975 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:45.410958 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:53:47.410397 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:47.410202 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:47.410869 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:47.410220 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:47.410869 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:47.410500 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:53:47.410869 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:47.410582 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:53:49.409840 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:49.409799 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:49.410266 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:49.409799 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:49.410266 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:49.409932 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:53:49.410266 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:49.410015 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:53:51.410368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:51.410332 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:51.410794 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:51.410458 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:53:51.410794 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:51.410487 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:51.410794 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:51.410596 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:53:53.019494 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:53.019458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:53.019967 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:53.019605 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:53.019967 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:53.019664 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs podName:148a2391-987d-4318-b295-01018903ff94 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:09.019646015 +0000 UTC m=+33.173259549 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs") pod "network-metrics-daemon-xvbxz" (UID: "148a2391-987d-4318-b295-01018903ff94") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:53.120076 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:53.120030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjzd\" (UniqueName: \"kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd\") pod \"network-check-target-clwv5\" (UID: \"eb26be0d-42dc-4350-8240-8da8402a51a3\") " pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:53.120253 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:53.120194 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:53.120253 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:53.120217 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:53.120253 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:53.120227 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgjzd for pod openshift-network-diagnostics/network-check-target-clwv5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:53.120381 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:53.120319 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd podName:eb26be0d-42dc-4350-8240-8da8402a51a3 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:09.120301491 +0000 UTC m=+33.273915034 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dgjzd" (UniqueName: "kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd") pod "network-check-target-clwv5" (UID: "eb26be0d-42dc-4350-8240-8da8402a51a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:53.410045 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:53.410007 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:53.410234 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:53.410007 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:53.410234 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:53.410125 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:53:53.410234 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:53.410200 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:53:55.410537 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:55.410496 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:55.410928 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:55.410496 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:55.410928 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:55.410617 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:53:55.410928 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:55.410679 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:53:56.571254 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.570842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal" event={"ID":"aef0222d478965f43e6fdd10ed145026","Type":"ContainerStarted","Data":"a3a6856433e555c0b5760de49c23853b2120bf43ac0c4307026c62c798aa0288"} Apr 24 23:53:56.572322 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.572301 2575 generic.go:358] "Generic (PLEG): container finished" podID="d47070605c0a9645ad2e709bbb472a77" containerID="0b2354654793f7d05cd3bb239a5ddee949726a98da24006df89330810a344d1f" exitCode=0 Apr 24 23:53:56.572388 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.572360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal" event={"ID":"d47070605c0a9645ad2e709bbb472a77","Type":"ContainerDied","Data":"0b2354654793f7d05cd3bb239a5ddee949726a98da24006df89330810a344d1f"} Apr 24 23:53:56.576401 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.576374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pkrwp" event={"ID":"bc9da0e8-bb12-42fb-a6da-363511285477","Type":"ContainerStarted","Data":"15299038246e70f83dba674056baf6906e3ad72534535ca3956b160a8ac6e4f5"} Apr 24 23:53:56.577966 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.577940 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zgjcf" event={"ID":"a3a145f6-4ef4-43fd-985d-2692fdc60a0b","Type":"ContainerStarted","Data":"090ac29d20e1d353af9c74917a682b910856d0d4787b89fe490a7131f44e802f"} Apr 24 23:53:56.579280 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.579250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" 
event={"ID":"173885a1-11c8-47c2-a5b4-51ef670b7bc6","Type":"ContainerStarted","Data":"7ef595acbd20bcd062931e3aab00bc3cf62f39473dd595997214d80c3b8435e4"} Apr 24 23:53:56.580542 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.580511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" event={"ID":"41d8f92c-2b55-41bd-b446-69fde40a9e8e","Type":"ContainerStarted","Data":"82953088cfbe23e2278f4f66bce0dc83b879a79ad8aff662add0471aebe92db7"} Apr 24 23:53:56.581711 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.581685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7662p" event={"ID":"55b4791c-ab54-4f79-a22b-f9adb92a1461","Type":"ContainerStarted","Data":"f5ee4f9a94d22ce5de2ece0661d5fd6b722c5ed13bc56775d758694a1d19e7ea"} Apr 24 23:53:56.582659 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.582620 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-234.ec2.internal" podStartSLOduration=20.582605895 podStartE2EDuration="20.582605895s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:56.582324617 +0000 UTC m=+20.735938170" watchObservedRunningTime="2026-04-24 23:53:56.582605895 +0000 UTC m=+20.736219446" Apr 24 23:53:56.583028 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.582997 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rpxzt" event={"ID":"60d87af7-f253-4e76-9605-d6707237c596","Type":"ContainerStarted","Data":"85f4c0d505653371c4b30d8678fe59e5f1a1791f2b926a774ed2baf83e126418"} Apr 24 23:53:56.585710 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.585691 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" 
event={"ID":"f11aa6c0-4d7d-4326-84df-857c34aa6e63","Type":"ContainerStarted","Data":"82c643abcbe1d2a2f1133789a2bde722c3ce7d7682dbd91ff92401268af51fa6"} Apr 24 23:53:56.585817 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.585717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" event={"ID":"f11aa6c0-4d7d-4326-84df-857c34aa6e63","Type":"ContainerStarted","Data":"8aa91c65309b924388c1e203810a7f1227634b6f3fc78bdc8e57473acf5075b0"} Apr 24 23:53:56.585817 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.585731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" event={"ID":"f11aa6c0-4d7d-4326-84df-857c34aa6e63","Type":"ContainerStarted","Data":"be42416c6e70947e8239db5fbf0f78e975800ece70ffeaa913de2df89a13c517"} Apr 24 23:53:56.585817 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.585740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" event={"ID":"f11aa6c0-4d7d-4326-84df-857c34aa6e63","Type":"ContainerStarted","Data":"2eafc5737f95d37d5bd582ff5ca8eeb482691856835e8ce7468884d6bc56a3be"} Apr 24 23:53:56.585817 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.585748 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" event={"ID":"f11aa6c0-4d7d-4326-84df-857c34aa6e63","Type":"ContainerStarted","Data":"478871bc19e1198c48b84564d303d48da104d10b4fd1319a06d53e28521e45d9"} Apr 24 23:53:56.585817 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.585756 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" event={"ID":"f11aa6c0-4d7d-4326-84df-857c34aa6e63","Type":"ContainerStarted","Data":"bec52d90d47d18164ed0655b9c213903073623bda6162853c7d27fbd2d7c78ea"} Apr 24 23:53:56.587057 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.587035 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb" containerID="f90907073d3e685dc2938ce5b556a739def82a3e3bad40cd3146865fe8bb208b" exitCode=0 Apr 24 23:53:56.587117 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.587069 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bmv4v" event={"ID":"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb","Type":"ContainerDied","Data":"f90907073d3e685dc2938ce5b556a739def82a3e3bad40cd3146865fe8bb208b"} Apr 24 23:53:56.600098 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.600052 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lxw9p" podStartSLOduration=2.7140015 podStartE2EDuration="20.6000359s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.638347634 +0000 UTC m=+1.791961163" lastFinishedPulling="2026-04-24 23:53:55.52438203 +0000 UTC m=+19.677995563" observedRunningTime="2026-04-24 23:53:56.599765658 +0000 UTC m=+20.753379208" watchObservedRunningTime="2026-04-24 23:53:56.6000359 +0000 UTC m=+20.753649452" Apr 24 23:53:56.626265 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.626149 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zgjcf" podStartSLOduration=2.846017791 podStartE2EDuration="20.626134301s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.692875741 +0000 UTC m=+1.846489271" lastFinishedPulling="2026-04-24 23:53:55.472992252 +0000 UTC m=+19.626605781" observedRunningTime="2026-04-24 23:53:56.625796632 +0000 UTC m=+20.779410176" watchObservedRunningTime="2026-04-24 23:53:56.626134301 +0000 UTC m=+20.779747851" Apr 24 23:53:56.642765 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.642715 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7662p" podStartSLOduration=2.924910115 
podStartE2EDuration="20.642701672s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.707257735 +0000 UTC m=+1.860871269" lastFinishedPulling="2026-04-24 23:53:55.425049288 +0000 UTC m=+19.578662826" observedRunningTime="2026-04-24 23:53:56.642618849 +0000 UTC m=+20.796232401" watchObservedRunningTime="2026-04-24 23:53:56.642701672 +0000 UTC m=+20.796315223" Apr 24 23:53:56.668083 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.668037 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pkrwp" podStartSLOduration=2.839069768 podStartE2EDuration="20.668022923s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.72821886 +0000 UTC m=+1.881832395" lastFinishedPulling="2026-04-24 23:53:55.557172018 +0000 UTC m=+19.710785550" observedRunningTime="2026-04-24 23:53:56.667360858 +0000 UTC m=+20.820974409" watchObservedRunningTime="2026-04-24 23:53:56.668022923 +0000 UTC m=+20.821636474" Apr 24 23:53:56.696819 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.696748 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rpxzt" podStartSLOduration=2.869789258 podStartE2EDuration="20.696733986s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.697541619 +0000 UTC m=+1.851155150" lastFinishedPulling="2026-04-24 23:53:55.524486333 +0000 UTC m=+19.678099878" observedRunningTime="2026-04-24 23:53:56.696648364 +0000 UTC m=+20.850261915" watchObservedRunningTime="2026-04-24 23:53:56.696733986 +0000 UTC m=+20.850347536" Apr 24 23:53:56.961501 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.961403 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zgjcf" Apr 24 23:53:56.962160 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:56.962135 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="kube-system/konnectivity-agent-zgjcf" Apr 24 23:53:57.410511 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:57.410477 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:57.410648 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:57.410550 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:57.410745 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:57.410718 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:53:57.410882 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:57.410854 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:53:57.451670 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:57.451647 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 23:53:57.590643 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:57.590539 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" event={"ID":"41d8f92c-2b55-41bd-b446-69fde40a9e8e","Type":"ContainerStarted","Data":"b9c965e069c6c99787ab94f4b39b7bdd0411ec5dffe43318ac6aff1e29888264"} Apr 24 23:53:57.592012 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:57.591969 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2w7lz" event={"ID":"908b6dc1-8fd0-4631-8022-d81ae6d15f95","Type":"ContainerStarted","Data":"878ee5150d04961d314e3e0ee13acbb9095cb0f246cf1159406c26368d89ff0b"} Apr 24 23:53:57.593823 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:57.593712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal" event={"ID":"d47070605c0a9645ad2e709bbb472a77","Type":"ContainerStarted","Data":"81b2ecfcb96c8120a04fba393ae066ac49a5cdf492d2b16b2e629cecb12ccd43"} Apr 24 23:53:57.606462 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:57.606418 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2w7lz" podStartSLOduration=3.785952071 podStartE2EDuration="21.606401388s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.679301088 +0000 UTC m=+1.832914617" lastFinishedPulling="2026-04-24 23:53:55.499750399 +0000 UTC m=+19.653363934" observedRunningTime="2026-04-24 23:53:57.606213295 +0000 UTC m=+21.759826845" 
watchObservedRunningTime="2026-04-24 23:53:57.606401388 +0000 UTC m=+21.760014941" Apr 24 23:53:57.618406 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:57.618352 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-234.ec2.internal" podStartSLOduration=21.61833976 podStartE2EDuration="21.61833976s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:57.617930567 +0000 UTC m=+21.771544115" watchObservedRunningTime="2026-04-24 23:53:57.61833976 +0000 UTC m=+21.771953310" Apr 24 23:53:57.663250 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:57.663215 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zgjcf" Apr 24 23:53:57.663876 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:57.663850 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zgjcf" Apr 24 23:53:58.374269 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:58.374163 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:53:57.451665981Z","UUID":"ace4ca50-f29b-454f-abf8-1f6ee64419c7","Handler":null,"Name":"","Endpoint":""} Apr 24 23:53:58.377272 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:58.376367 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 23:53:58.377272 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:58.376397 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 23:53:58.598223 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:58.598129 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" event={"ID":"41d8f92c-2b55-41bd-b446-69fde40a9e8e","Type":"ContainerStarted","Data":"fe4546e0f4707dc804bdb19a0194c9821030e1739e151eb50dcad56cbc683077"} Apr 24 23:53:58.601627 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:58.601591 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" event={"ID":"f11aa6c0-4d7d-4326-84df-857c34aa6e63","Type":"ContainerStarted","Data":"ecf608ccee272db61bd20478edc61d3a9faa1c85339f212825f71c2cccfebc1b"} Apr 24 23:53:58.625942 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:58.625886 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtxqr" podStartSLOduration=1.9932921179999998 podStartE2EDuration="22.625871559s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.620686576 +0000 UTC m=+1.774300105" lastFinishedPulling="2026-04-24 23:53:58.253266014 +0000 UTC m=+22.406879546" observedRunningTime="2026-04-24 23:53:58.62543797 +0000 UTC m=+22.779051520" watchObservedRunningTime="2026-04-24 23:53:58.625871559 +0000 UTC m=+22.779485109" Apr 24 23:53:59.410668 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:59.410640 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:53:59.411038 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:53:59.410640 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:53:59.411038 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:59.410759 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:53:59.411038 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:53:59.410878 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:54:00.550968 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.550802 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-csxnd"] Apr 24 23:54:00.553572 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.553555 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:00.553655 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:00.553621 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-csxnd" podUID="50a991ab-3e74-4c00-bb7b-b6b5ca42b15f" Apr 24 23:54:00.610484 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.610383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" event={"ID":"f11aa6c0-4d7d-4326-84df-857c34aa6e63","Type":"ContainerStarted","Data":"c4d085c21c2610a71ad3a96bab0fbc36a31e5a04c761cf10df29257847bab884"} Apr 24 23:54:00.611191 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.610906 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:54:00.611191 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.610932 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:54:00.611191 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.610947 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:54:00.635007 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.634079 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:54:00.635007 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.634168 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" Apr 24 23:54:00.669910 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.669874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-kubelet-config\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:00.670052 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.669925 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-dbus\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:00.670052 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.669953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:00.675488 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.675440 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4822f" podStartSLOduration=6.534971234 podStartE2EDuration="24.675428469s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.687299991 +0000 UTC m=+1.840913523" lastFinishedPulling="2026-04-24 23:53:55.827757226 +0000 UTC m=+19.981370758" observedRunningTime="2026-04-24 23:54:00.673950098 +0000 UTC m=+24.827563649" watchObservedRunningTime="2026-04-24 23:54:00.675428469 +0000 UTC m=+24.829042020" Apr 24 23:54:00.771117 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.771081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-kubelet-config\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:00.771117 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.771120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-dbus\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:00.771322 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.771145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:00.771322 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.771209 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-kubelet-config\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:00.771379 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:00.771328 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:00.771408 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:00.771380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-dbus\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:00.771441 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:00.771412 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret podName:50a991ab-3e74-4c00-bb7b-b6b5ca42b15f nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:01.271391165 +0000 UTC m=+25.425004696 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret") pod "global-pull-secret-syncer-csxnd" (UID: "50a991ab-3e74-4c00-bb7b-b6b5ca42b15f") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:01.275204 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:01.275168 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:01.275348 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:01.275281 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:01.275348 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:01.275332 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret podName:50a991ab-3e74-4c00-bb7b-b6b5ca42b15f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:02.275317824 +0000 UTC m=+26.428931353 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret") pod "global-pull-secret-syncer-csxnd" (UID: "50a991ab-3e74-4c00-bb7b-b6b5ca42b15f") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:01.409814 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:01.409759 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:54:01.409957 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:01.409760 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:54:01.409957 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:01.409894 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:54:01.410041 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:01.409953 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:54:01.613996 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:01.613909 2575 generic.go:358] "Generic (PLEG): container finished" podID="53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb" containerID="99214c6e981ad7b29be08bb29b82adc081e073805f27a402062638ee8abe845c" exitCode=0 Apr 24 23:54:01.614743 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:01.613997 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bmv4v" event={"ID":"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb","Type":"ContainerDied","Data":"99214c6e981ad7b29be08bb29b82adc081e073805f27a402062638ee8abe845c"} Apr 24 23:54:02.284944 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:02.284542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:02.284944 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:02.284939 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:02.285155 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:02.285014 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret podName:50a991ab-3e74-4c00-bb7b-b6b5ca42b15f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:04.284994122 +0000 UTC m=+28.438607651 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret") pod "global-pull-secret-syncer-csxnd" (UID: "50a991ab-3e74-4c00-bb7b-b6b5ca42b15f") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:02.410641 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:02.410601 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:02.410815 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:02.410745 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csxnd" podUID="50a991ab-3e74-4c00-bb7b-b6b5ca42b15f" Apr 24 23:54:02.501633 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:02.501607 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-csxnd"] Apr 24 23:54:02.504209 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:02.504186 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-clwv5"] Apr 24 23:54:02.504309 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:02.504281 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:54:02.504404 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:02.504379 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:54:02.515826 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:02.515795 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xvbxz"] Apr 24 23:54:02.515978 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:02.515918 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:54:02.516054 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:02.516034 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:54:02.617673 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:02.617585 2575 generic.go:358] "Generic (PLEG): container finished" podID="53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb" containerID="6a6af628029a7cb25129c8769f990a9da78b29370b61d1b29123006953939685" exitCode=0 Apr 24 23:54:02.618084 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:02.617677 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:02.618084 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:02.617671 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bmv4v" event={"ID":"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb","Type":"ContainerDied","Data":"6a6af628029a7cb25129c8769f990a9da78b29370b61d1b29123006953939685"} Apr 24 23:54:02.618084 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:02.617953 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csxnd" podUID="50a991ab-3e74-4c00-bb7b-b6b5ca42b15f" Apr 24 23:54:03.621231 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:03.621192 2575 generic.go:358] "Generic (PLEG): container finished" podID="53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb" containerID="2ea23ff7caf77330c2634bcc5e7215bd68db3f99a00362ebd78dcf18d70f324a" exitCode=0 Apr 24 23:54:03.621610 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:03.621257 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bmv4v" event={"ID":"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb","Type":"ContainerDied","Data":"2ea23ff7caf77330c2634bcc5e7215bd68db3f99a00362ebd78dcf18d70f324a"} Apr 24 23:54:04.300138 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:04.300105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:04.300322 ip-10-0-128-234 
kubenswrapper[2575]: E0424 23:54:04.300289 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:04.300376 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:04.300358 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret podName:50a991ab-3e74-4c00-bb7b-b6b5ca42b15f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.300341746 +0000 UTC m=+32.453955275 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret") pod "global-pull-secret-syncer-csxnd" (UID: "50a991ab-3e74-4c00-bb7b-b6b5ca42b15f") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:04.409978 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:04.409944 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:54:04.410157 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:04.409944 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:54:04.410157 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:04.410080 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:54:04.410273 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:04.410167 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:54:04.410273 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:04.409944 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:04.410366 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:04.410306 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csxnd" podUID="50a991ab-3e74-4c00-bb7b-b6b5ca42b15f" Apr 24 23:54:06.410982 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:06.410787 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:54:06.411551 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:06.410863 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:06.411551 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:06.411062 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3" Apr 24 23:54:06.411551 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:06.411168 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csxnd" podUID="50a991ab-3e74-4c00-bb7b-b6b5ca42b15f" Apr 24 23:54:06.411551 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:06.410890 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:54:06.411551 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:06.411275 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:54:08.331001 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.330962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:08.331531 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:08.331094 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:08.331531 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:08.331154 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret podName:50a991ab-3e74-4c00-bb7b-b6b5ca42b15f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:16.3311404 +0000 UTC m=+40.484753929 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret") pod "global-pull-secret-syncer-csxnd" (UID: "50a991ab-3e74-4c00-bb7b-b6b5ca42b15f") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:08.410476 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.410440 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csxnd" Apr 24 23:54:08.410625 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.410440 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:54:08.410625 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:08.410583 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csxnd" podUID="50a991ab-3e74-4c00-bb7b-b6b5ca42b15f" Apr 24 23:54:08.410742 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.410440 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:54:08.410742 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:08.410676 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:54:08.410837 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:08.410736 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-clwv5" podUID="eb26be0d-42dc-4350-8240-8da8402a51a3"
Apr 24 23:54:08.661510 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.661478 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-234.ec2.internal" event="NodeReady"
Apr 24 23:54:08.661718 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.661651 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 23:54:08.725153 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.725121 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6kzrg"]
Apr 24 23:54:08.747601 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.747566 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-c5wq4"]
Apr 24 23:54:08.747789 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.747757 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.750472 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.750447 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 23:54:08.750472 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.750468 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 23:54:08.750985 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.750965 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-drk7t\""
Apr 24 23:54:08.770288 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.770260 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6kzrg"]
Apr 24 23:54:08.770288 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.770287 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c5wq4"]
Apr 24 23:54:08.770478 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.770412 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:08.774030 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.774005 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 23:54:08.774178 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.774084 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 23:54:08.774178 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.774172 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 23:54:08.774414 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.774396 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gqd2q\""
Apr 24 23:54:08.835642 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.835609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5e433d1-e134-4312-9418-0c609e10c09c-config-volume\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.835851 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.835648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:08.835851 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.835726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5e433d1-e134-4312-9418-0c609e10c09c-tmp-dir\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.835851 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.835794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9p2x\" (UniqueName: \"kubernetes.io/projected/efeb0526-c248-4f97-ad71-12762132cd18-kube-api-access-d9p2x\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:08.835986 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.835892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkw2f\" (UniqueName: \"kubernetes.io/projected/f5e433d1-e134-4312-9418-0c609e10c09c-kube-api-access-dkw2f\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.835986 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.835957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.937072 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.936983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkw2f\" (UniqueName: \"kubernetes.io/projected/f5e433d1-e134-4312-9418-0c609e10c09c-kube-api-access-dkw2f\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.937072 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.937059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.937290 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.937093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5e433d1-e134-4312-9418-0c609e10c09c-config-volume\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.937290 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.937118 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:08.937290 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.937182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5e433d1-e134-4312-9418-0c609e10c09c-tmp-dir\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.937290 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.937204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9p2x\" (UniqueName: \"kubernetes.io/projected/efeb0526-c248-4f97-ad71-12762132cd18-kube-api-access-d9p2x\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:08.937290 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:08.937236 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:08.937290 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:08.937289 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:08.937530 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:08.937312 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls podName:f5e433d1-e134-4312-9418-0c609e10c09c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:09.437289365 +0000 UTC m=+33.590902896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls") pod "dns-default-6kzrg" (UID: "f5e433d1-e134-4312-9418-0c609e10c09c") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:08.937530 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:08.937331 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert podName:efeb0526-c248-4f97-ad71-12762132cd18 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:09.43731983 +0000 UTC m=+33.590933366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert") pod "ingress-canary-c5wq4" (UID: "efeb0526-c248-4f97-ad71-12762132cd18") : secret "canary-serving-cert" not found
Apr 24 23:54:08.937671 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.937636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5e433d1-e134-4312-9418-0c609e10c09c-config-volume\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.937671 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.937656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5e433d1-e134-4312-9418-0c609e10c09c-tmp-dir\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.952019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.951988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkw2f\" (UniqueName: \"kubernetes.io/projected/f5e433d1-e134-4312-9418-0c609e10c09c-kube-api-access-dkw2f\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:08.952174 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:08.951993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9p2x\" (UniqueName: \"kubernetes.io/projected/efeb0526-c248-4f97-ad71-12762132cd18-kube-api-access-d9p2x\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:09.038174 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:09.038133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz"
Apr 24 23:54:09.038341 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:09.038308 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:09.038395 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:09.038382 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs podName:148a2391-987d-4318-b295-01018903ff94 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:41.038364769 +0000 UTC m=+65.191978321 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs") pod "network-metrics-daemon-xvbxz" (UID: "148a2391-987d-4318-b295-01018903ff94") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:09.139055 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:09.139014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjzd\" (UniqueName: \"kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd\") pod \"network-check-target-clwv5\" (UID: \"eb26be0d-42dc-4350-8240-8da8402a51a3\") " pod="openshift-network-diagnostics/network-check-target-clwv5"
Apr 24 23:54:09.139222 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:09.139200 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:54:09.139304 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:09.139230 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:54:09.139304 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:09.139245 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgjzd for pod openshift-network-diagnostics/network-check-target-clwv5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:09.139383 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:09.139310 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd podName:eb26be0d-42dc-4350-8240-8da8402a51a3 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:41.139290829 +0000 UTC m=+65.292904363 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dgjzd" (UniqueName: "kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd") pod "network-check-target-clwv5" (UID: "eb26be0d-42dc-4350-8240-8da8402a51a3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:09.440814 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:09.440761 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:09.441377 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:09.440826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:09.441377 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:09.440882 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:09.441377 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:09.440932 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:09.441377 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:09.440950 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls podName:f5e433d1-e134-4312-9418-0c609e10c09c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:10.440934096 +0000 UTC m=+34.594547626 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls") pod "dns-default-6kzrg" (UID: "f5e433d1-e134-4312-9418-0c609e10c09c") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:09.441377 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:09.440969 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert podName:efeb0526-c248-4f97-ad71-12762132cd18 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:10.440958234 +0000 UTC m=+34.594571762 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert") pod "ingress-canary-c5wq4" (UID: "efeb0526-c248-4f97-ad71-12762132cd18") : secret "canary-serving-cert" not found
Apr 24 23:54:10.409946 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.409901 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csxnd"
Apr 24 23:54:10.409946 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.409939 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5"
Apr 24 23:54:10.410192 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.409987 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz"
Apr 24 23:54:10.420531 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.420509 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 23:54:10.421278 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.421262 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qtxb2\""
Apr 24 23:54:10.421826 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.421805 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 23:54:10.421826 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.421819 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 23:54:10.421974 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.421810 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 23:54:10.423672 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.423656 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tjkj6\""
Apr 24 23:54:10.447261 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.447227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:10.447261 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.447269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:10.447707 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:10.447380 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:10.447707 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:10.447398 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:10.447707 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:10.447439 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert podName:efeb0526-c248-4f97-ad71-12762132cd18 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:12.447424878 +0000 UTC m=+36.601038407 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert") pod "ingress-canary-c5wq4" (UID: "efeb0526-c248-4f97-ad71-12762132cd18") : secret "canary-serving-cert" not found
Apr 24 23:54:10.447707 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:10.447475 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls podName:f5e433d1-e134-4312-9418-0c609e10c09c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:12.447457451 +0000 UTC m=+36.601070999 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls") pod "dns-default-6kzrg" (UID: "f5e433d1-e134-4312-9418-0c609e10c09c") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:10.637099 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.637065 2575 generic.go:358] "Generic (PLEG): container finished" podID="53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb" containerID="9071d921e8ad82979e3d09216495b37cf994c1dda70ef531628957211878d879" exitCode=0
Apr 24 23:54:10.637379 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:10.637114 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bmv4v" event={"ID":"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb","Type":"ContainerDied","Data":"9071d921e8ad82979e3d09216495b37cf994c1dda70ef531628957211878d879"}
Apr 24 23:54:11.641460 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:11.641423 2575 generic.go:358] "Generic (PLEG): container finished" podID="53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb" containerID="2ca1960e6b71287d73c957438206085c8cdb29af4247607bd2bb0d7d275427c2" exitCode=0
Apr 24 23:54:11.641861 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:11.641482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bmv4v" event={"ID":"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb","Type":"ContainerDied","Data":"2ca1960e6b71287d73c957438206085c8cdb29af4247607bd2bb0d7d275427c2"}
Apr 24 23:54:12.462473 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:12.462433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:12.462473 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:12.462480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:12.462685 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:12.462587 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:12.462685 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:12.462597 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:12.462685 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:12.462640 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert podName:efeb0526-c248-4f97-ad71-12762132cd18 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:16.462625445 +0000 UTC m=+40.616238975 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert") pod "ingress-canary-c5wq4" (UID: "efeb0526-c248-4f97-ad71-12762132cd18") : secret "canary-serving-cert" not found
Apr 24 23:54:12.462685 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:12.462655 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls podName:f5e433d1-e134-4312-9418-0c609e10c09c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:16.462647703 +0000 UTC m=+40.616261231 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls") pod "dns-default-6kzrg" (UID: "f5e433d1-e134-4312-9418-0c609e10c09c") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:12.647593 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:12.647558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bmv4v" event={"ID":"53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb","Type":"ContainerStarted","Data":"b2d18cc2aa644d706116973557cf92f4283f201fd25b04e4e12c8a58ea80526b"}
Apr 24 23:54:12.672532 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:12.672481 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bmv4v" podStartSLOduration=4.524388309 podStartE2EDuration="36.672463433s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.670399954 +0000 UTC m=+1.824013497" lastFinishedPulling="2026-04-24 23:54:09.818475079 +0000 UTC m=+33.972088621" observedRunningTime="2026-04-24 23:54:12.670795099 +0000 UTC m=+36.824408641" watchObservedRunningTime="2026-04-24 23:54:12.672463433 +0000 UTC m=+36.826076983"
Apr 24 23:54:16.390825 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:16.390767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd"
Apr 24 23:54:16.393902 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:16.393877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50a991ab-3e74-4c00-bb7b-b6b5ca42b15f-original-pull-secret\") pod \"global-pull-secret-syncer-csxnd\" (UID: \"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f\") " pod="kube-system/global-pull-secret-syncer-csxnd"
Apr 24 23:54:16.429211 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:16.429174 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csxnd"
Apr 24 23:54:16.492081 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:16.492041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:16.492247 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:16.492136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:16.492247 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:16.492205 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:16.492247 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:16.492229 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:16.492339 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:16.492273 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert podName:efeb0526-c248-4f97-ad71-12762132cd18 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:24.492257181 +0000 UTC m=+48.645870710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert") pod "ingress-canary-c5wq4" (UID: "efeb0526-c248-4f97-ad71-12762132cd18") : secret "canary-serving-cert" not found
Apr 24 23:54:16.492339 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:16.492288 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls podName:f5e433d1-e134-4312-9418-0c609e10c09c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:24.49228152 +0000 UTC m=+48.645895049 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls") pod "dns-default-6kzrg" (UID: "f5e433d1-e134-4312-9418-0c609e10c09c") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:16.608284 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:16.608249 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-csxnd"]
Apr 24 23:54:16.612302 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:54:16.612273 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50a991ab_3e74_4c00_bb7b_b6b5ca42b15f.slice/crio-39a9c98402a9e6b9360a94eb6cf6c42960dd25713bfa5bf70e5ae29816ba5255 WatchSource:0}: Error finding container 39a9c98402a9e6b9360a94eb6cf6c42960dd25713bfa5bf70e5ae29816ba5255: Status 404 returned error can't find the container with id 39a9c98402a9e6b9360a94eb6cf6c42960dd25713bfa5bf70e5ae29816ba5255
Apr 24 23:54:16.655618 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:16.655578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-csxnd" event={"ID":"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f","Type":"ContainerStarted","Data":"39a9c98402a9e6b9360a94eb6cf6c42960dd25713bfa5bf70e5ae29816ba5255"}
Apr 24 23:54:21.665938 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:21.665900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-csxnd" event={"ID":"50a991ab-3e74-4c00-bb7b-b6b5ca42b15f","Type":"ContainerStarted","Data":"5aacab5e63eec9667e5c01160beb5b0b79a7468f96066bba368c0df0c6171ca3"}
Apr 24 23:54:24.550301 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:24.550260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:24.550301 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:24.550309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:24.550725 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:24.550405 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:24.550725 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:24.550411 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:24.550725 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:24.550469 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls podName:f5e433d1-e134-4312-9418-0c609e10c09c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:40.550452388 +0000 UTC m=+64.704065917 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls") pod "dns-default-6kzrg" (UID: "f5e433d1-e134-4312-9418-0c609e10c09c") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:24.550725 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:24.550484 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert podName:efeb0526-c248-4f97-ad71-12762132cd18 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:40.550476293 +0000 UTC m=+64.704089821 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert") pod "ingress-canary-c5wq4" (UID: "efeb0526-c248-4f97-ad71-12762132cd18") : secret "canary-serving-cert" not found
Apr 24 23:54:32.631440 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:32.631408 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4822f"
Apr 24 23:54:32.660014 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:32.659957 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-csxnd" podStartSLOduration=28.764092398 podStartE2EDuration="32.659941417s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:54:16.614136442 +0000 UTC m=+40.767749985" lastFinishedPulling="2026-04-24 23:54:20.509985461 +0000 UTC m=+44.663599004" observedRunningTime="2026-04-24 23:54:21.683678904 +0000 UTC m=+45.837292457" watchObservedRunningTime="2026-04-24 23:54:32.659941417 +0000 UTC m=+56.813554968"
Apr 24 23:54:40.567418 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:40.567380 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:54:40.567844 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:40.567467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:54:40.567844 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:40.567552 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:40.567844 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:40.567609 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls podName:f5e433d1-e134-4312-9418-0c609e10c09c nodeName:}" failed. No retries permitted until 2026-04-24 23:55:12.567594745 +0000 UTC m=+96.721208274 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls") pod "dns-default-6kzrg" (UID: "f5e433d1-e134-4312-9418-0c609e10c09c") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:40.567844 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:40.567552 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:40.567844 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:40.567648 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert podName:efeb0526-c248-4f97-ad71-12762132cd18 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:12.56763993 +0000 UTC m=+96.721253458 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert") pod "ingress-canary-c5wq4" (UID: "efeb0526-c248-4f97-ad71-12762132cd18") : secret "canary-serving-cert" not found
Apr 24 23:54:41.072388 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:41.072347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz"
Apr 24 23:54:41.075406 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:41.075385 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 23:54:41.083624 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:41.083601 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 23:54:41.083728 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:54:41.083717 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs podName:148a2391-987d-4318-b295-01018903ff94 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:45.083696583 +0000 UTC m=+129.237310121 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs") pod "network-metrics-daemon-xvbxz" (UID: "148a2391-987d-4318-b295-01018903ff94") : secret "metrics-daemon-secret" not found
Apr 24 23:54:41.173038 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:41.172984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjzd\" (UniqueName: \"kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd\") pod \"network-check-target-clwv5\" (UID: \"eb26be0d-42dc-4350-8240-8da8402a51a3\") " pod="openshift-network-diagnostics/network-check-target-clwv5"
Apr 24 23:54:41.175807 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:41.175769 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 23:54:41.186444 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:41.186417 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 23:54:41.196903 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:41.196879 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgjzd\" (UniqueName: \"kubernetes.io/projected/eb26be0d-42dc-4350-8240-8da8402a51a3-kube-api-access-dgjzd\") pod \"network-check-target-clwv5\" (UID: \"eb26be0d-42dc-4350-8240-8da8402a51a3\") " pod="openshift-network-diagnostics/network-check-target-clwv5"
Apr 24 23:54:41.328341 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:41.328249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tjkj6\""
Apr 24 23:54:41.336375 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:41.336347 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:54:41.468025 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:41.467995 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-clwv5"] Apr 24 23:54:41.470947 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:54:41.470918 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb26be0d_42dc_4350_8240_8da8402a51a3.slice/crio-8f97b73052ec99f3c7c91986f92e143b3011e2b0d36d3ee2f2e4210d119d8f8b WatchSource:0}: Error finding container 8f97b73052ec99f3c7c91986f92e143b3011e2b0d36d3ee2f2e4210d119d8f8b: Status 404 returned error can't find the container with id 8f97b73052ec99f3c7c91986f92e143b3011e2b0d36d3ee2f2e4210d119d8f8b Apr 24 23:54:41.703369 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:41.703337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-clwv5" event={"ID":"eb26be0d-42dc-4350-8240-8da8402a51a3","Type":"ContainerStarted","Data":"8f97b73052ec99f3c7c91986f92e143b3011e2b0d36d3ee2f2e4210d119d8f8b"} Apr 24 23:54:44.042968 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.042929 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz"] Apr 24 23:54:44.045560 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.045543 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.048615 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.048592 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 23:54:44.049798 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.049760 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 23:54:44.049798 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.049760 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 23:54:44.049977 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.049765 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 23:54:44.061017 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.060990 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz"] Apr 24 23:54:44.092966 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.092941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42b467d5-fece-454f-a00a-550cc3a4e698-tmp\") pod \"klusterlet-addon-workmgr-84f9bc9448-7qqkz\" (UID: \"42b467d5-fece-454f-a00a-550cc3a4e698\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.093058 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.092980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/42b467d5-fece-454f-a00a-550cc3a4e698-klusterlet-config\") pod \"klusterlet-addon-workmgr-84f9bc9448-7qqkz\" (UID: \"42b467d5-fece-454f-a00a-550cc3a4e698\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.093058 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.093006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wftvj\" (UniqueName: \"kubernetes.io/projected/42b467d5-fece-454f-a00a-550cc3a4e698-kube-api-access-wftvj\") pod \"klusterlet-addon-workmgr-84f9bc9448-7qqkz\" (UID: \"42b467d5-fece-454f-a00a-550cc3a4e698\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.194070 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.194035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42b467d5-fece-454f-a00a-550cc3a4e698-tmp\") pod \"klusterlet-addon-workmgr-84f9bc9448-7qqkz\" (UID: \"42b467d5-fece-454f-a00a-550cc3a4e698\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.194070 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.194074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/42b467d5-fece-454f-a00a-550cc3a4e698-klusterlet-config\") pod \"klusterlet-addon-workmgr-84f9bc9448-7qqkz\" (UID: \"42b467d5-fece-454f-a00a-550cc3a4e698\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.194347 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.194097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wftvj\" (UniqueName: \"kubernetes.io/projected/42b467d5-fece-454f-a00a-550cc3a4e698-kube-api-access-wftvj\") pod \"klusterlet-addon-workmgr-84f9bc9448-7qqkz\" 
(UID: \"42b467d5-fece-454f-a00a-550cc3a4e698\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.194529 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.194504 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42b467d5-fece-454f-a00a-550cc3a4e698-tmp\") pod \"klusterlet-addon-workmgr-84f9bc9448-7qqkz\" (UID: \"42b467d5-fece-454f-a00a-550cc3a4e698\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.196505 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.196487 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/42b467d5-fece-454f-a00a-550cc3a4e698-klusterlet-config\") pod \"klusterlet-addon-workmgr-84f9bc9448-7qqkz\" (UID: \"42b467d5-fece-454f-a00a-550cc3a4e698\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.203098 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.203074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wftvj\" (UniqueName: \"kubernetes.io/projected/42b467d5-fece-454f-a00a-550cc3a4e698-kube-api-access-wftvj\") pod \"klusterlet-addon-workmgr-84f9bc9448-7qqkz\" (UID: \"42b467d5-fece-454f-a00a-550cc3a4e698\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.354454 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.354334 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:44.465324 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.465291 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz"] Apr 24 23:54:44.468418 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:54:44.468385 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b467d5_fece_454f_a00a_550cc3a4e698.slice/crio-9606f317b862144c7effcd89b20fe11e4b2bdef7943ae2ff894219f0a7dd4e5d WatchSource:0}: Error finding container 9606f317b862144c7effcd89b20fe11e4b2bdef7943ae2ff894219f0a7dd4e5d: Status 404 returned error can't find the container with id 9606f317b862144c7effcd89b20fe11e4b2bdef7943ae2ff894219f0a7dd4e5d Apr 24 23:54:44.710175 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.710140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" event={"ID":"42b467d5-fece-454f-a00a-550cc3a4e698","Type":"ContainerStarted","Data":"9606f317b862144c7effcd89b20fe11e4b2bdef7943ae2ff894219f0a7dd4e5d"} Apr 24 23:54:44.711213 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.711185 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-clwv5" event={"ID":"eb26be0d-42dc-4350-8240-8da8402a51a3","Type":"ContainerStarted","Data":"b888a4cdbafc61ae5b18df9298331e035e247c20ebe751b34ae8d7c20d58a52a"} Apr 24 23:54:44.711453 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.711436 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:54:44.729561 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:44.729514 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-clwv5" podStartSLOduration=66.13069296 podStartE2EDuration="1m8.72949845s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:54:41.472823397 +0000 UTC m=+65.626436927" lastFinishedPulling="2026-04-24 23:54:44.071628883 +0000 UTC m=+68.225242417" observedRunningTime="2026-04-24 23:54:44.728355682 +0000 UTC m=+68.881969234" watchObservedRunningTime="2026-04-24 23:54:44.72949845 +0000 UTC m=+68.883112000" Apr 24 23:54:48.720536 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:48.720496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" event={"ID":"42b467d5-fece-454f-a00a-550cc3a4e698","Type":"ContainerStarted","Data":"2e96246a112cf931aa217bba7178588e7d88809cb76743fb2035ca6f5370d745"} Apr 24 23:54:48.720906 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:48.720715 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:48.722407 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:48.722388 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:54:48.737218 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:54:48.737172 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" podStartSLOduration=1.4431028399999999 podStartE2EDuration="4.737159331s" podCreationTimestamp="2026-04-24 23:54:44 +0000 UTC" firstStartedPulling="2026-04-24 23:54:44.470129615 +0000 UTC m=+68.623743145" lastFinishedPulling="2026-04-24 23:54:47.764186096 +0000 UTC m=+71.917799636" observedRunningTime="2026-04-24 23:54:48.736382327 +0000 UTC m=+72.889995879" watchObservedRunningTime="2026-04-24 
23:54:48.737159331 +0000 UTC m=+72.890772881" Apr 24 23:55:12.591892 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:55:12.591837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg" Apr 24 23:55:12.591892 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:55:12.591902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4" Apr 24 23:55:12.592421 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:55:12.592007 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:55:12.592421 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:55:12.592040 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:55:12.592421 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:55:12.592083 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls podName:f5e433d1-e134-4312-9418-0c609e10c09c nodeName:}" failed. No retries permitted until 2026-04-24 23:56:16.592064368 +0000 UTC m=+160.745677900 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls") pod "dns-default-6kzrg" (UID: "f5e433d1-e134-4312-9418-0c609e10c09c") : secret "dns-default-metrics-tls" not found Apr 24 23:55:12.592421 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:55:12.592099 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert podName:efeb0526-c248-4f97-ad71-12762132cd18 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:16.592092893 +0000 UTC m=+160.745706426 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert") pod "ingress-canary-c5wq4" (UID: "efeb0526-c248-4f97-ad71-12762132cd18") : secret "canary-serving-cert" not found Apr 24 23:55:15.716318 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:55:15.716283 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-clwv5" Apr 24 23:55:45.120218 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:55:45.120161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:55:45.120718 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:55:45.120327 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 23:55:45.120718 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:55:45.120421 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs podName:148a2391-987d-4318-b295-01018903ff94 nodeName:}" 
failed. No retries permitted until 2026-04-24 23:57:47.120402641 +0000 UTC m=+251.274016188 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs") pod "network-metrics-daemon-xvbxz" (UID: "148a2391-987d-4318-b295-01018903ff94") : secret "metrics-daemon-secret" not found Apr 24 23:56:00.096536 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.096499 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk"] Apr 24 23:56:00.099182 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.099161 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:00.101640 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.101596 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 23:56:00.103028 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.103009 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 23:56:00.103143 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.103015 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 23:56:00.104370 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.104347 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bdd6m\"" Apr 24 23:56:00.104475 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.104395 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 23:56:00.108408 ip-10-0-128-234 kubenswrapper[2575]: I0424 
23:56:00.108386 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk"] Apr 24 23:56:00.228757 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.228719 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx5f9\" (UniqueName: \"kubernetes.io/projected/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-kube-api-access-sx5f9\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:00.228960 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.228795 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:00.228960 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.228848 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:00.329233 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.329189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sx5f9\" (UniqueName: \"kubernetes.io/projected/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-kube-api-access-sx5f9\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:00.329347 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.329252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:00.329347 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.329305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:00.329433 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:00.329416 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:00.329499 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:00.329489 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls podName:ead2f3e3-ce6d-4077-a58c-391d75b4bb6c nodeName:}" failed. No retries permitted until 2026-04-24 23:56:00.829469138 +0000 UTC m=+144.983082666 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fqcfk" (UID: "ead2f3e3-ce6d-4077-a58c-391d75b4bb6c") : secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:00.330066 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.330047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:00.337960 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.337940 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx5f9\" (UniqueName: \"kubernetes.io/projected/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-kube-api-access-sx5f9\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:00.832907 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:00.832866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:00.833095 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:00.832989 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:00.833095 ip-10-0-128-234 
kubenswrapper[2575]: E0424 23:56:00.833045 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls podName:ead2f3e3-ce6d-4077-a58c-391d75b4bb6c nodeName:}" failed. No retries permitted until 2026-04-24 23:56:01.833031152 +0000 UTC m=+145.986644680 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fqcfk" (UID: "ead2f3e3-ce6d-4077-a58c-391d75b4bb6c") : secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:01.841187 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:01.841122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:01.841587 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:01.841291 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:01.841587 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:01.841362 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls podName:ead2f3e3-ce6d-4077-a58c-391d75b4bb6c nodeName:}" failed. No retries permitted until 2026-04-24 23:56:03.841345284 +0000 UTC m=+147.994958813 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fqcfk" (UID: "ead2f3e3-ce6d-4077-a58c-391d75b4bb6c") : secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:03.857518 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:03.857467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:03.857948 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:03.857596 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:03.857948 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:03.857660 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls podName:ead2f3e3-ce6d-4077-a58c-391d75b4bb6c nodeName:}" failed. No retries permitted until 2026-04-24 23:56:07.857645519 +0000 UTC m=+152.011259048 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fqcfk" (UID: "ead2f3e3-ce6d-4077-a58c-391d75b4bb6c") : secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:05.296455 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:05.296419 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rpxzt_60d87af7-f253-4e76-9605-d6707237c596/dns-node-resolver/0.log" Apr 24 23:56:06.091156 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:06.091128 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7662p_55b4791c-ab54-4f79-a22b-f9adb92a1461/node-ca/0.log" Apr 24 23:56:07.887715 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:07.887666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" Apr 24 23:56:07.888133 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:07.887835 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:07.888133 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:07.887901 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls podName:ead2f3e3-ce6d-4077-a58c-391d75b4bb6c nodeName:}" failed. No retries permitted until 2026-04-24 23:56:15.887885584 +0000 UTC m=+160.041499113 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fqcfk" (UID: "ead2f3e3-ce6d-4077-a58c-391d75b4bb6c") : secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:09.954028 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.953997 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq"] Apr 24 23:56:09.957166 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.957142 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:09.957892 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.957868 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m"] Apr 24 23:56:09.959896 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.959878 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-p8zvl\"" Apr 24 23:56:09.960714 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.960696 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:09.961209 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.961185 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 23:56:09.961320 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.961230 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 23:56:09.961320 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.961246 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:56:09.961320 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.961286 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 23:56:09.963805 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.963768 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 23:56:09.964052 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.964031 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 23:56:09.964121 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.964054 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:56:09.964121 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.964073 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-szzhm\"" Apr 24 23:56:09.964272 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.964253 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 23:56:09.967722 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.967702 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq"] Apr 24 23:56:09.969915 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:09.969894 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m"] Apr 24 23:56:10.057359 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.057326 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wlwd7"] Apr 24 23:56:10.060133 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.060118 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.062970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.062948 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 23:56:10.062970 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.062967 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-vtxxt\"" Apr 24 23:56:10.063198 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.063184 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 23:56:10.063364 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.063349 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 23:56:10.063433 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.063351 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:56:10.068602 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.068582 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 23:56:10.069716 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.069694 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wlwd7"] Apr 24 23:56:10.103005 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.102968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfs47\" (UniqueName: \"kubernetes.io/projected/0ff7fe59-d120-4f53-9df4-e0b4aa229a5e-kube-api-access-tfs47\") pod \"service-ca-operator-d6fc45fc5-xqmlq\" (UID: 
\"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:10.103005 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.103015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79704e7-d789-4f4c-8f4b-b4183bea75dd-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-tgx7m\" (UID: \"f79704e7-d789-4f4c-8f4b-b4183bea75dd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:10.103258 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.103075 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79704e7-d789-4f4c-8f4b-b4183bea75dd-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-tgx7m\" (UID: \"f79704e7-d789-4f4c-8f4b-b4183bea75dd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:10.103258 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.103110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff7fe59-d120-4f53-9df4-e0b4aa229a5e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-xqmlq\" (UID: \"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:10.103258 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.103168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khpd8\" (UniqueName: \"kubernetes.io/projected/f79704e7-d789-4f4c-8f4b-b4183bea75dd-kube-api-access-khpd8\") pod \"kube-storage-version-migrator-operator-6769c5d45-tgx7m\" (UID: 
\"f79704e7-d789-4f4c-8f4b-b4183bea75dd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:10.103258 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.103211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff7fe59-d120-4f53-9df4-e0b4aa229a5e-config\") pod \"service-ca-operator-d6fc45fc5-xqmlq\" (UID: \"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:10.204534 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.204376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79704e7-d789-4f4c-8f4b-b4183bea75dd-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-tgx7m\" (UID: \"f79704e7-d789-4f4c-8f4b-b4183bea75dd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:10.204534 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.204522 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4bdc38e-3787-4844-9d7a-323427247405-config\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.204703 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.204545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4bdc38e-3787-4844-9d7a-323427247405-serving-cert\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.204703 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.204562 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn57t\" (UniqueName: \"kubernetes.io/projected/c4bdc38e-3787-4844-9d7a-323427247405-kube-api-access-rn57t\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.204703 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.204596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff7fe59-d120-4f53-9df4-e0b4aa229a5e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-xqmlq\" (UID: \"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:10.204703 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.204650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khpd8\" (UniqueName: \"kubernetes.io/projected/f79704e7-d789-4f4c-8f4b-b4183bea75dd-kube-api-access-khpd8\") pod \"kube-storage-version-migrator-operator-6769c5d45-tgx7m\" (UID: \"f79704e7-d789-4f4c-8f4b-b4183bea75dd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:10.204703 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.204682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff7fe59-d120-4f53-9df4-e0b4aa229a5e-config\") pod \"service-ca-operator-d6fc45fc5-xqmlq\" (UID: \"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:10.204979 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.204728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4bdc38e-3787-4844-9d7a-323427247405-trusted-ca\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.204979 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.204763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfs47\" (UniqueName: \"kubernetes.io/projected/0ff7fe59-d120-4f53-9df4-e0b4aa229a5e-kube-api-access-tfs47\") pod \"service-ca-operator-d6fc45fc5-xqmlq\" (UID: \"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:10.204979 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.204840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79704e7-d789-4f4c-8f4b-b4183bea75dd-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-tgx7m\" (UID: \"f79704e7-d789-4f4c-8f4b-b4183bea75dd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:10.205327 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.205304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff7fe59-d120-4f53-9df4-e0b4aa229a5e-config\") pod \"service-ca-operator-d6fc45fc5-xqmlq\" (UID: \"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:10.205450 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.205427 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79704e7-d789-4f4c-8f4b-b4183bea75dd-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-tgx7m\" (UID: \"f79704e7-d789-4f4c-8f4b-b4183bea75dd\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:10.206760 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.206740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff7fe59-d120-4f53-9df4-e0b4aa229a5e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-xqmlq\" (UID: \"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:10.206826 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.206740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79704e7-d789-4f4c-8f4b-b4183bea75dd-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-tgx7m\" (UID: \"f79704e7-d789-4f4c-8f4b-b4183bea75dd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:10.213257 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.213235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khpd8\" (UniqueName: \"kubernetes.io/projected/f79704e7-d789-4f4c-8f4b-b4183bea75dd-kube-api-access-khpd8\") pod \"kube-storage-version-migrator-operator-6769c5d45-tgx7m\" (UID: \"f79704e7-d789-4f4c-8f4b-b4183bea75dd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:10.213347 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.213267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfs47\" (UniqueName: \"kubernetes.io/projected/0ff7fe59-d120-4f53-9df4-e0b4aa229a5e-kube-api-access-tfs47\") pod \"service-ca-operator-d6fc45fc5-xqmlq\" (UID: \"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:10.269758 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.269705 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" Apr 24 23:56:10.275508 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.275477 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" Apr 24 23:56:10.305561 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.305525 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4bdc38e-3787-4844-9d7a-323427247405-config\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.305561 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.305562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4bdc38e-3787-4844-9d7a-323427247405-serving-cert\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.305845 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.305594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn57t\" (UniqueName: \"kubernetes.io/projected/c4bdc38e-3787-4844-9d7a-323427247405-kube-api-access-rn57t\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.305845 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.305660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c4bdc38e-3787-4844-9d7a-323427247405-trusted-ca\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.306900 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.306845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4bdc38e-3787-4844-9d7a-323427247405-trusted-ca\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.307071 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.307044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4bdc38e-3787-4844-9d7a-323427247405-config\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.309709 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.309670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4bdc38e-3787-4844-9d7a-323427247405-serving-cert\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.314286 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.314246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn57t\" (UniqueName: \"kubernetes.io/projected/c4bdc38e-3787-4844-9d7a-323427247405-kube-api-access-rn57t\") pod \"console-operator-9d4b6777b-wlwd7\" (UID: \"c4bdc38e-3787-4844-9d7a-323427247405\") " pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.369146 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.369072 
2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" Apr 24 23:56:10.399596 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.399559 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m"] Apr 24 23:56:10.404364 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:10.404333 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf79704e7_d789_4f4c_8f4b_b4183bea75dd.slice/crio-e356e0d26fe8985699d0005d5daa39eafd6b0e19abbeffb36cf26280a3836c98 WatchSource:0}: Error finding container e356e0d26fe8985699d0005d5daa39eafd6b0e19abbeffb36cf26280a3836c98: Status 404 returned error can't find the container with id e356e0d26fe8985699d0005d5daa39eafd6b0e19abbeffb36cf26280a3836c98 Apr 24 23:56:10.415751 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.415717 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq"] Apr 24 23:56:10.418739 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:10.418710 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ff7fe59_d120_4f53_9df4_e0b4aa229a5e.slice/crio-04798ae8e86215fc4669c1b62f7bd7048e8a6e3c6585b2839c8b3e782e3f16c9 WatchSource:0}: Error finding container 04798ae8e86215fc4669c1b62f7bd7048e8a6e3c6585b2839c8b3e782e3f16c9: Status 404 returned error can't find the container with id 04798ae8e86215fc4669c1b62f7bd7048e8a6e3c6585b2839c8b3e782e3f16c9 Apr 24 23:56:10.493083 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.493007 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wlwd7"] Apr 24 23:56:10.495967 ip-10-0-128-234 kubenswrapper[2575]: W0424 
23:56:10.495941 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4bdc38e_3787_4844_9d7a_323427247405.slice/crio-4efeda376a8a69e52288eecfa31d0a6e73a7d691e1d1198bd71ce6b87185458c WatchSource:0}: Error finding container 4efeda376a8a69e52288eecfa31d0a6e73a7d691e1d1198bd71ce6b87185458c: Status 404 returned error can't find the container with id 4efeda376a8a69e52288eecfa31d0a6e73a7d691e1d1198bd71ce6b87185458c Apr 24 23:56:10.880621 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.880542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" event={"ID":"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e","Type":"ContainerStarted","Data":"04798ae8e86215fc4669c1b62f7bd7048e8a6e3c6585b2839c8b3e782e3f16c9"} Apr 24 23:56:10.882077 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.881958 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" event={"ID":"f79704e7-d789-4f4c-8f4b-b4183bea75dd","Type":"ContainerStarted","Data":"e356e0d26fe8985699d0005d5daa39eafd6b0e19abbeffb36cf26280a3836c98"} Apr 24 23:56:10.883512 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:10.883480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" event={"ID":"c4bdc38e-3787-4844-9d7a-323427247405","Type":"ContainerStarted","Data":"4efeda376a8a69e52288eecfa31d0a6e73a7d691e1d1198bd71ce6b87185458c"} Apr 24 23:56:11.760234 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:11.760188 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6kzrg" podUID="f5e433d1-e134-4312-9418-0c609e10c09c" Apr 24 23:56:11.783633 ip-10-0-128-234 
kubenswrapper[2575]: E0424 23:56:11.783587 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-c5wq4" podUID="efeb0526-c248-4f97-ad71-12762132cd18" Apr 24 23:56:11.885955 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:11.885923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6kzrg" Apr 24 23:56:13.419405 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:13.419363 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-xvbxz" podUID="148a2391-987d-4318-b295-01018903ff94" Apr 24 23:56:13.891962 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:13.891884 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/0.log" Apr 24 23:56:13.891962 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:13.891932 2575 generic.go:358] "Generic (PLEG): container finished" podID="c4bdc38e-3787-4844-9d7a-323427247405" containerID="e26592a2e93384f0df7572c18ae9b2adf8be8ce4c622e4e264b9614cf2363490" exitCode=255 Apr 24 23:56:13.892275 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:13.892029 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" event={"ID":"c4bdc38e-3787-4844-9d7a-323427247405","Type":"ContainerDied","Data":"e26592a2e93384f0df7572c18ae9b2adf8be8ce4c622e4e264b9614cf2363490"} Apr 24 23:56:13.892275 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:13.892254 2575 scope.go:117] "RemoveContainer" containerID="e26592a2e93384f0df7572c18ae9b2adf8be8ce4c622e4e264b9614cf2363490" Apr 24 23:56:13.893439 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:56:13.893411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" event={"ID":"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e","Type":"ContainerStarted","Data":"d8e6c27a62263942c10fa3f54660277a2809301824eb4b89f78745a4c795487f"} Apr 24 23:56:13.894656 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:13.894635 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" event={"ID":"f79704e7-d789-4f4c-8f4b-b4183bea75dd","Type":"ContainerStarted","Data":"d6674efcd22995292800e76fd8531614c153c102b15106a2fe85038ebb927da1"} Apr 24 23:56:13.928907 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:13.928849 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" podStartSLOduration=2.273213476 podStartE2EDuration="4.928831863s" podCreationTimestamp="2026-04-24 23:56:09 +0000 UTC" firstStartedPulling="2026-04-24 23:56:10.406280169 +0000 UTC m=+154.559893704" lastFinishedPulling="2026-04-24 23:56:13.061898562 +0000 UTC m=+157.215512091" observedRunningTime="2026-04-24 23:56:13.928069206 +0000 UTC m=+158.081682767" watchObservedRunningTime="2026-04-24 23:56:13.928831863 +0000 UTC m=+158.082445415" Apr 24 23:56:13.942406 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:13.942351 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" podStartSLOduration=2.296518876 podStartE2EDuration="4.942331898s" podCreationTimestamp="2026-04-24 23:56:09 +0000 UTC" firstStartedPulling="2026-04-24 23:56:10.422388565 +0000 UTC m=+154.576002094" lastFinishedPulling="2026-04-24 23:56:13.068201586 +0000 UTC m=+157.221815116" observedRunningTime="2026-04-24 23:56:13.942093508 +0000 UTC 
m=+158.095707061" watchObservedRunningTime="2026-04-24 23:56:13.942331898 +0000 UTC m=+158.095945469" Apr 24 23:56:14.423600 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.423556 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww"] Apr 24 23:56:14.426589 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.426570 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww" Apr 24 23:56:14.429485 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.429460 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 23:56:14.429610 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.429462 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-wzqgx\"" Apr 24 23:56:14.430419 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.430402 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 23:56:14.433704 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.433681 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww"] Apr 24 23:56:14.544063 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.543979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjhh\" (UniqueName: \"kubernetes.io/projected/19ae70e3-42b5-45c3-9397-8982cf99e3ac-kube-api-access-vsjhh\") pod \"migrator-74bb7799d9-c58ww\" (UID: \"19ae70e3-42b5-45c3-9397-8982cf99e3ac\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww" Apr 24 23:56:14.644520 ip-10-0-128-234 kubenswrapper[2575]: 
I0424 23:56:14.644487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjhh\" (UniqueName: \"kubernetes.io/projected/19ae70e3-42b5-45c3-9397-8982cf99e3ac-kube-api-access-vsjhh\") pod \"migrator-74bb7799d9-c58ww\" (UID: \"19ae70e3-42b5-45c3-9397-8982cf99e3ac\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww"
Apr 24 23:56:14.655772 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.655743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsjhh\" (UniqueName: \"kubernetes.io/projected/19ae70e3-42b5-45c3-9397-8982cf99e3ac-kube-api-access-vsjhh\") pod \"migrator-74bb7799d9-c58ww\" (UID: \"19ae70e3-42b5-45c3-9397-8982cf99e3ac\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww"
Apr 24 23:56:14.736839 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.736719 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww"
Apr 24 23:56:14.853380 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.853346 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww"]
Apr 24 23:56:14.856586 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:14.856562 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19ae70e3_42b5_45c3_9397_8982cf99e3ac.slice/crio-f7fa2b6cb23afe0653a419da33219d1fdfebfbf78ae0b8306e0eb674baaf9685 WatchSource:0}: Error finding container f7fa2b6cb23afe0653a419da33219d1fdfebfbf78ae0b8306e0eb674baaf9685: Status 404 returned error can't find the container with id f7fa2b6cb23afe0653a419da33219d1fdfebfbf78ae0b8306e0eb674baaf9685
Apr 24 23:56:14.899004 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.898963 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww" event={"ID":"19ae70e3-42b5-45c3-9397-8982cf99e3ac","Type":"ContainerStarted","Data":"f7fa2b6cb23afe0653a419da33219d1fdfebfbf78ae0b8306e0eb674baaf9685"}
Apr 24 23:56:14.900279 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.900259 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log"
Apr 24 23:56:14.900600 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.900587 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/0.log"
Apr 24 23:56:14.900679 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.900622 2575 generic.go:358] "Generic (PLEG): container finished" podID="c4bdc38e-3787-4844-9d7a-323427247405" containerID="8aef80393b2e4cbd185a0f0d40facd4837294e43f3dcace5404a5f14fdd670e1" exitCode=255
Apr 24 23:56:14.900728 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.900702 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" event={"ID":"c4bdc38e-3787-4844-9d7a-323427247405","Type":"ContainerDied","Data":"8aef80393b2e4cbd185a0f0d40facd4837294e43f3dcace5404a5f14fdd670e1"}
Apr 24 23:56:14.900768 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.900738 2575 scope.go:117] "RemoveContainer" containerID="e26592a2e93384f0df7572c18ae9b2adf8be8ce4c622e4e264b9614cf2363490"
Apr 24 23:56:14.901020 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:14.900991 2575 scope.go:117] "RemoveContainer" containerID="8aef80393b2e4cbd185a0f0d40facd4837294e43f3dcace5404a5f14fdd670e1"
Apr 24 23:56:14.901235 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:14.901213 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wlwd7_openshift-console-operator(c4bdc38e-3787-4844-9d7a-323427247405)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" podUID="c4bdc38e-3787-4844-9d7a-323427247405"
Apr 24 23:56:15.904565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:15.904536 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log"
Apr 24 23:56:15.905070 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:15.905044 2575 scope.go:117] "RemoveContainer" containerID="8aef80393b2e4cbd185a0f0d40facd4837294e43f3dcace5404a5f14fdd670e1"
Apr 24 23:56:15.905256 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:15.905233 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wlwd7_openshift-console-operator(c4bdc38e-3787-4844-9d7a-323427247405)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" podUID="c4bdc38e-3787-4844-9d7a-323427247405"
Apr 24 23:56:15.906272 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:15.906247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww" event={"ID":"19ae70e3-42b5-45c3-9397-8982cf99e3ac","Type":"ContainerStarted","Data":"8f5c9a6340b2c400d5b84b0e47c502494a02356b6e3ce0c762367ed1aee25a7d"}
Apr 24 23:56:15.906386 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:15.906277 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww" event={"ID":"19ae70e3-42b5-45c3-9397-8982cf99e3ac","Type":"ContainerStarted","Data":"5f7ce50688c9e7f0893c1ad5facfd98303b8ba468d4d34b76aa167630151193f"}
Apr 24 23:56:15.936470 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:15.936420 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c58ww" podStartSLOduration=1.084326998 podStartE2EDuration="1.93640488s" podCreationTimestamp="2026-04-24 23:56:14 +0000 UTC" firstStartedPulling="2026-04-24 23:56:14.858312441 +0000 UTC m=+159.011925970" lastFinishedPulling="2026-04-24 23:56:15.710390319 +0000 UTC m=+159.864003852" observedRunningTime="2026-04-24 23:56:15.935485768 +0000 UTC m=+160.089099320" watchObservedRunningTime="2026-04-24 23:56:15.93640488 +0000 UTC m=+160.090018464"
Apr 24 23:56:15.955989 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:15.955950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk"
Apr 24 23:56:15.956152 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:15.956094 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 23:56:15.956187 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:15.956160 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls podName:ead2f3e3-ce6d-4077-a58c-391d75b4bb6c nodeName:}" failed. No retries permitted until 2026-04-24 23:56:31.956145082 +0000 UTC m=+176.109758612 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fqcfk" (UID: "ead2f3e3-ce6d-4077-a58c-391d75b4bb6c") : secret "cluster-monitoring-operator-tls" not found
Apr 24 23:56:16.660015 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.659986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg"
Apr 24 23:56:16.660015 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.660022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:56:16.660325 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:16.660129 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:56:16.660325 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:16.660188 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls podName:f5e433d1-e134-4312-9418-0c609e10c09c nodeName:}" failed. No retries permitted until 2026-04-24 23:58:18.660172717 +0000 UTC m=+282.813786246 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls") pod "dns-default-6kzrg" (UID: "f5e433d1-e134-4312-9418-0c609e10c09c") : secret "dns-default-metrics-tls" not found
Apr 24 23:56:16.660325 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:16.660134 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:56:16.660325 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:16.660250 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert podName:efeb0526-c248-4f97-ad71-12762132cd18 nodeName:}" failed. No retries permitted until 2026-04-24 23:58:18.660235139 +0000 UTC m=+282.813848668 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert") pod "ingress-canary-c5wq4" (UID: "efeb0526-c248-4f97-ad71-12762132cd18") : secret "canary-serving-cert" not found
Apr 24 23:56:16.700020 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.699989 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb"]
Apr 24 23:56:16.702551 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.702533 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb"
Apr 24 23:56:16.705992 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.705975 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-q4rpl\""
Apr 24 23:56:16.713910 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.713883 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb"]
Apr 24 23:56:16.761171 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.761139 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9l2gf"]
Apr 24 23:56:16.764370 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.764353 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.776033 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.776011 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 23:56:16.776033 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.776021 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-l8vsl\""
Apr 24 23:56:16.776341 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.776327 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 23:56:16.780522 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.780500 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 23:56:16.781812 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.781791 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9l2gf"]
Apr 24 23:56:16.787593 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.787564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 23:56:16.861322 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.861283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-crio-socket\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.861490 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.861353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh6dr\" (UniqueName: \"kubernetes.io/projected/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-kube-api-access-fh6dr\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.861490 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.861407 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-data-volume\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.861565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.861492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjq56\" (UniqueName: \"kubernetes.io/projected/9f47c623-583e-4112-8494-6b034149be3a-kube-api-access-hjq56\") pod \"network-check-source-8894fc9bd-v4tzb\" (UID: \"9f47c623-583e-4112-8494-6b034149be3a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb"
Apr 24 23:56:16.861565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.861552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.861627 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.861600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.962729 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.962623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-data-volume\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.962729 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.962711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjq56\" (UniqueName: \"kubernetes.io/projected/9f47c623-583e-4112-8494-6b034149be3a-kube-api-access-hjq56\") pod \"network-check-source-8894fc9bd-v4tzb\" (UID: \"9f47c623-583e-4112-8494-6b034149be3a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb"
Apr 24 23:56:16.963249 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.962763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.963249 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:16.962893 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 23:56:16.963249 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.962951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.963249 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:16.962974 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls podName:dd2d64ec-40bf-420a-8983-c9eb0c1bb070 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:17.462953506 +0000 UTC m=+161.616567036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls") pod "insights-runtime-extractor-9l2gf" (UID: "dd2d64ec-40bf-420a-8983-c9eb0c1bb070") : secret "insights-runtime-extractor-tls" not found
Apr 24 23:56:16.963249 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.962989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-data-volume\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.963249 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.963013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-crio-socket\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.963249 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.963057 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-crio-socket\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.963249 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.963082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh6dr\" (UniqueName: \"kubernetes.io/projected/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-kube-api-access-fh6dr\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.963503 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.963353 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.972904 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.972871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh6dr\" (UniqueName: \"kubernetes.io/projected/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-kube-api-access-fh6dr\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:16.973025 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:16.972966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjq56\" (UniqueName: \"kubernetes.io/projected/9f47c623-583e-4112-8494-6b034149be3a-kube-api-access-hjq56\") pod \"network-check-source-8894fc9bd-v4tzb\" (UID: \"9f47c623-583e-4112-8494-6b034149be3a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb"
Apr 24 23:56:17.010903 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:17.010867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb"
Apr 24 23:56:17.134982 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:17.134949 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb"]
Apr 24 23:56:17.137678 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:17.137645 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f47c623_583e_4112_8494_6b034149be3a.slice/crio-09ed67b0785f37d925ec1b2888d2e85b777263184bbda25cece9aca78edbfc80 WatchSource:0}: Error finding container 09ed67b0785f37d925ec1b2888d2e85b777263184bbda25cece9aca78edbfc80: Status 404 returned error can't find the container with id 09ed67b0785f37d925ec1b2888d2e85b777263184bbda25cece9aca78edbfc80
Apr 24 23:56:17.467699 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:17.467664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:17.467909 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:17.467833 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 23:56:17.467954 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:17.467919 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls podName:dd2d64ec-40bf-420a-8983-c9eb0c1bb070 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:18.467898109 +0000 UTC m=+162.621511641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls") pod "insights-runtime-extractor-9l2gf" (UID: "dd2d64ec-40bf-420a-8983-c9eb0c1bb070") : secret "insights-runtime-extractor-tls" not found
Apr 24 23:56:17.912407 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:17.912374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb" event={"ID":"9f47c623-583e-4112-8494-6b034149be3a","Type":"ContainerStarted","Data":"e765aa0d33d844582414e4a626c8f08303ed8988595f3138ea115395bef67809"}
Apr 24 23:56:17.912407 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:17.912411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb" event={"ID":"9f47c623-583e-4112-8494-6b034149be3a","Type":"ContainerStarted","Data":"09ed67b0785f37d925ec1b2888d2e85b777263184bbda25cece9aca78edbfc80"}
Apr 24 23:56:17.934861 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:17.934808 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v4tzb" podStartSLOduration=1.9347737170000001 podStartE2EDuration="1.934773717s" podCreationTimestamp="2026-04-24 23:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:17.934253679 +0000 UTC m=+162.087867231" watchObservedRunningTime="2026-04-24 23:56:17.934773717 +0000 UTC m=+162.088387273"
Apr 24 23:56:18.474620 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:18.474576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:18.475055 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:18.474729 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 23:56:18.475055 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:18.474814 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls podName:dd2d64ec-40bf-420a-8983-c9eb0c1bb070 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:20.474797456 +0000 UTC m=+164.628410985 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls") pod "insights-runtime-extractor-9l2gf" (UID: "dd2d64ec-40bf-420a-8983-c9eb0c1bb070") : secret "insights-runtime-extractor-tls" not found
Apr 24 23:56:20.369591 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:20.369547 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7"
Apr 24 23:56:20.369591 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:20.369590 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7"
Apr 24 23:56:20.370160 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:20.370074 2575 scope.go:117] "RemoveContainer" containerID="8aef80393b2e4cbd185a0f0d40facd4837294e43f3dcace5404a5f14fdd670e1"
Apr 24 23:56:20.370304 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:20.370282 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wlwd7_openshift-console-operator(c4bdc38e-3787-4844-9d7a-323427247405)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" podUID="c4bdc38e-3787-4844-9d7a-323427247405"
Apr 24 23:56:20.488307 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:20.488271 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:20.488472 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:20.488414 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 23:56:20.488511 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:20.488489 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls podName:dd2d64ec-40bf-420a-8983-c9eb0c1bb070 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:24.488472228 +0000 UTC m=+168.642085756 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls") pod "insights-runtime-extractor-9l2gf" (UID: "dd2d64ec-40bf-420a-8983-c9eb0c1bb070") : secret "insights-runtime-extractor-tls" not found
Apr 24 23:56:23.410135 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:23.410086 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c5wq4"
Apr 24 23:56:24.410361 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:24.410327 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz"
Apr 24 23:56:24.518051 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:24.518013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:24.518216 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:24.518159 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 23:56:24.518257 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:24.518232 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls podName:dd2d64ec-40bf-420a-8983-c9eb0c1bb070 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:32.518215135 +0000 UTC m=+176.671828668 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls") pod "insights-runtime-extractor-9l2gf" (UID: "dd2d64ec-40bf-420a-8983-c9eb0c1bb070") : secret "insights-runtime-extractor-tls" not found
Apr 24 23:56:31.978209 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:31.978110 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk"
Apr 24 23:56:31.980514 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:31.980493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ead2f3e3-ce6d-4077-a58c-391d75b4bb6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fqcfk\" (UID: \"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk"
Apr 24 23:56:32.209111 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.209056 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk"
Apr 24 23:56:32.323939 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.323907 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk"]
Apr 24 23:56:32.327126 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:32.327097 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podead2f3e3_ce6d_4077_a58c_391d75b4bb6c.slice/crio-37543488af6bfcca26a7fe02baf459d3b11d56a1b161960586496b149321221e WatchSource:0}: Error finding container 37543488af6bfcca26a7fe02baf459d3b11d56a1b161960586496b149321221e: Status 404 returned error can't find the container with id 37543488af6bfcca26a7fe02baf459d3b11d56a1b161960586496b149321221e
Apr 24 23:56:32.410212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.410182 2575 scope.go:117] "RemoveContainer" containerID="8aef80393b2e4cbd185a0f0d40facd4837294e43f3dcace5404a5f14fdd670e1"
Apr 24 23:56:32.583389 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.583283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:32.585578 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.585556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd2d64ec-40bf-420a-8983-c9eb0c1bb070-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9l2gf\" (UID: \"dd2d64ec-40bf-420a-8983-c9eb0c1bb070\") " pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:32.672413 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.672377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9l2gf"
Apr 24 23:56:32.817290 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.817256 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9l2gf"]
Apr 24 23:56:32.820641 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:32.820608 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd2d64ec_40bf_420a_8983_c9eb0c1bb070.slice/crio-2cb1a24f6fdf2b0fb22a40109e1f76354bbfd5799c58be181b101ff5cb950d06 WatchSource:0}: Error finding container 2cb1a24f6fdf2b0fb22a40109e1f76354bbfd5799c58be181b101ff5cb950d06: Status 404 returned error can't find the container with id 2cb1a24f6fdf2b0fb22a40109e1f76354bbfd5799c58be181b101ff5cb950d06
Apr 24 23:56:32.949027 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.948996 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log"
Apr 24 23:56:32.949210 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.949099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" event={"ID":"c4bdc38e-3787-4844-9d7a-323427247405","Type":"ContainerStarted","Data":"73ca434a66b6c6bb3af66282e775394acf84a2d04178233b4ff1fa0b57819fef"}
Apr 24 23:56:32.949679 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.949420 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7"
Apr 24 23:56:32.950926 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.950892 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9l2gf" event={"ID":"dd2d64ec-40bf-420a-8983-c9eb0c1bb070","Type":"ContainerStarted","Data":"f595579e6a75924acacadbb39032a07ecffd01bfb73c68785d3ba0ec7935f07e"}
Apr 24 23:56:32.951051 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.950934 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9l2gf" event={"ID":"dd2d64ec-40bf-420a-8983-c9eb0c1bb070","Type":"ContainerStarted","Data":"2cb1a24f6fdf2b0fb22a40109e1f76354bbfd5799c58be181b101ff5cb950d06"}
Apr 24 23:56:32.952104 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.952074 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" event={"ID":"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c","Type":"ContainerStarted","Data":"37543488af6bfcca26a7fe02baf459d3b11d56a1b161960586496b149321221e"}
Apr 24 23:56:32.955127 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.955103 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7"
Apr 24 23:56:32.968160 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:32.968111 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-wlwd7" podStartSLOduration=20.401789049 podStartE2EDuration="22.968093086s" podCreationTimestamp="2026-04-24 23:56:10 +0000 UTC" firstStartedPulling="2026-04-24 23:56:10.497599406 +0000 UTC m=+154.651212936" lastFinishedPulling="2026-04-24 23:56:13.06390343 +0000 UTC m=+157.217516973" observedRunningTime="2026-04-24 23:56:32.966641163 +0000 UTC m=+177.120254712" watchObservedRunningTime="2026-04-24 23:56:32.968093086 +0000 UTC m=+177.121706627"
Apr 24 23:56:33.956881 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:33.956846 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9l2gf" event={"ID":"dd2d64ec-40bf-420a-8983-c9eb0c1bb070","Type":"ContainerStarted","Data":"7b7292369f82c486b13e900e101f0e6471b12e6c136340bbf53fcf9c889291b8"}
Apr 24 23:56:33.958340 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:33.958301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" event={"ID":"ead2f3e3-ce6d-4077-a58c-391d75b4bb6c","Type":"ContainerStarted","Data":"338010d7260ade05bdd178d470b04aff89321d138578726cf81929a28e874467"}
Apr 24 23:56:33.976114 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:33.976069 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fqcfk" podStartSLOduration=32.444822812 podStartE2EDuration="33.976054529s" podCreationTimestamp="2026-04-24 23:56:00 +0000 UTC" firstStartedPulling="2026-04-24 23:56:32.329271228 +0000 UTC m=+176.482884758" lastFinishedPulling="2026-04-24 23:56:33.860502938 +0000 UTC m=+178.014116475" observedRunningTime="2026-04-24 23:56:33.975920307 +0000 UTC m=+178.129533858" watchObservedRunningTime="2026-04-24 23:56:33.976054529 +0000 UTC m=+178.129668080"
Apr 24 23:56:35.965738 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:35.965698 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9l2gf" event={"ID":"dd2d64ec-40bf-420a-8983-c9eb0c1bb070","Type":"ContainerStarted","Data":"25e505202c0477da7e1c3bfe601ef1267520a1255f1487a4859df3a4ec8af017"}
Apr 24 23:56:40.241348 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.241291 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9l2gf" podStartSLOduration=21.888207342 podStartE2EDuration="24.241271801s" podCreationTimestamp="2026-04-24 23:56:16 +0000 UTC" firstStartedPulling="2026-04-24 23:56:32.880360552 +0000 UTC m=+177.033974100" lastFinishedPulling="2026-04-24 23:56:35.233425026
+0000 UTC m=+179.387038559" observedRunningTime="2026-04-24 23:56:35.992703621 +0000 UTC m=+180.146317171" watchObservedRunningTime="2026-04-24 23:56:40.241271801 +0000 UTC m=+184.394885352" Apr 24 23:56:40.241743 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.241564 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-588b569ff5-gxtpm"] Apr 24 23:56:40.244968 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.244950 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.247628 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.247586 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 23:56:40.247935 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.247914 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpz6k\"" Apr 24 23:56:40.248038 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.247959 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 23:56:40.249283 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.249266 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 23:56:40.254566 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.254544 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 23:56:40.260818 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.260769 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-588b569ff5-gxtpm"] Apr 24 23:56:40.348728 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.348690 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb654\" (UniqueName: \"kubernetes.io/projected/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-kube-api-access-fb654\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.348909 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.348733 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-registry-certificates\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.348909 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.348841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-bound-sa-token\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.348909 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.348873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-trusted-ca\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.348909 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.348899 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-ca-trust-extracted\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.349211 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.348948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-image-registry-private-configuration\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.349211 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.348980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-installation-pull-secrets\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.349211 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.349047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-registry-tls\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.449519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.449476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-bound-sa-token\") pod \"image-registry-588b569ff5-gxtpm\" (UID: 
\"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.449519 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.449522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-trusted-ca\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.449814 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.449556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-ca-trust-extracted\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.449814 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.449594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-image-registry-private-configuration\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.449814 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.449621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-installation-pull-secrets\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.449814 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.449753 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-registry-tls\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.450028 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.449861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fb654\" (UniqueName: \"kubernetes.io/projected/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-kube-api-access-fb654\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.450028 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.449890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-registry-certificates\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.450136 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.450072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-ca-trust-extracted\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.450643 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.450619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-registry-certificates\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " 
pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.450849 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.450830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-trusted-ca\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.452209 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.452192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-installation-pull-secrets\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.452481 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.452453 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-image-registry-private-configuration\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.452685 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.452664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-registry-tls\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.457449 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.457428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb654\" (UniqueName: 
\"kubernetes.io/projected/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-kube-api-access-fb654\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.457551 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.457469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0f5a613-6fd5-430e-ad23-4b5e9282d16b-bound-sa-token\") pod \"image-registry-588b569ff5-gxtpm\" (UID: \"f0f5a613-6fd5-430e-ad23-4b5e9282d16b\") " pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.554707 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.554614 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:40.676461 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.676423 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-588b569ff5-gxtpm"] Apr 24 23:56:40.679098 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:40.679064 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f5a613_6fd5_430e_ad23_4b5e9282d16b.slice/crio-7988369d8f0a09f12560b66d4126fa00bbc03ca875e9acf6a917f1c934742a6c WatchSource:0}: Error finding container 7988369d8f0a09f12560b66d4126fa00bbc03ca875e9acf6a917f1c934742a6c: Status 404 returned error can't find the container with id 7988369d8f0a09f12560b66d4126fa00bbc03ca875e9acf6a917f1c934742a6c Apr 24 23:56:40.980016 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.979981 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" event={"ID":"f0f5a613-6fd5-430e-ad23-4b5e9282d16b","Type":"ContainerStarted","Data":"2a7e8487eb9acc815fdb0906fdbf1a62c6d3f74f10d3a9582e4676d4ab4f6978"} Apr 
24 23:56:40.980179 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.980022 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" event={"ID":"f0f5a613-6fd5-430e-ad23-4b5e9282d16b","Type":"ContainerStarted","Data":"7988369d8f0a09f12560b66d4126fa00bbc03ca875e9acf6a917f1c934742a6c"} Apr 24 23:56:40.980179 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:40.980053 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:56:41.014796 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:41.014729 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" podStartSLOduration=1.014712517 podStartE2EDuration="1.014712517s" podCreationTimestamp="2026-04-24 23:56:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:41.01399785 +0000 UTC m=+185.167611414" watchObservedRunningTime="2026-04-24 23:56:41.014712517 +0000 UTC m=+185.168326067" Apr 24 23:56:47.747474 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.747439 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h"] Apr 24 23:56:47.769589 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.769557 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wjnsp"] Apr 24 23:56:47.769837 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.769717 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:47.772175 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.772148 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 23:56:47.772296 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.772155 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-fklm5\"" Apr 24 23:56:47.772590 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.772573 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 23:56:47.773270 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.773252 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 23:56:47.781383 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.781364 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h"] Apr 24 23:56:47.781479 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.781469 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:47.783924 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.783902 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 23:56:47.784026 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.783940 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 23:56:47.784026 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.783975 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7cj9k\"" Apr 24 23:56:47.784026 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.784010 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 23:56:47.909728 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.909692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b1bb9157-0e73-4254-87e9-1965229f6880-root\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:47.909966 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.909750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09dde8d4-e623-4ee0-84ec-541cf470f1d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:47.909966 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.909807 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09dde8d4-e623-4ee0-84ec-541cf470f1d8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:47.909966 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.909842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-accelerators-collector-config\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:47.909966 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.909868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1bb9157-0e73-4254-87e9-1965229f6880-sys\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:47.910186 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.909995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cds5q\" (UniqueName: \"kubernetes.io/projected/09dde8d4-e623-4ee0-84ec-541cf470f1d8-kube-api-access-cds5q\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:47.910186 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.910034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-textfile\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:47.910186 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.910062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09dde8d4-e623-4ee0-84ec-541cf470f1d8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:47.910186 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.910096 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:47.910186 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.910129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-tls\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:47.910186 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.910177 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1bb9157-0e73-4254-87e9-1965229f6880-metrics-client-ca\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 
23:56:47.910430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.910232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-wtmp\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:47.910430 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.910288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4pb8\" (UniqueName: \"kubernetes.io/projected/b1bb9157-0e73-4254-87e9-1965229f6880-kube-api-access-d4pb8\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:47.999974 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.999896 2575 generic.go:358] "Generic (PLEG): container finished" podID="42b467d5-fece-454f-a00a-550cc3a4e698" containerID="2e96246a112cf931aa217bba7178588e7d88809cb76743fb2035ca6f5370d745" exitCode=1 Apr 24 23:56:48.000116 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:47.999972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" event={"ID":"42b467d5-fece-454f-a00a-550cc3a4e698","Type":"ContainerDied","Data":"2e96246a112cf931aa217bba7178588e7d88809cb76743fb2035ca6f5370d745"} Apr 24 23:56:48.000315 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.000302 2575 scope.go:117] "RemoveContainer" containerID="2e96246a112cf931aa217bba7178588e7d88809cb76743fb2035ca6f5370d745" Apr 24 23:56:48.011404 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-wtmp\") pod 
\"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011520 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4pb8\" (UniqueName: \"kubernetes.io/projected/b1bb9157-0e73-4254-87e9-1965229f6880-kube-api-access-d4pb8\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011578 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b1bb9157-0e73-4254-87e9-1965229f6880-root\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011578 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09dde8d4-e623-4ee0-84ec-541cf470f1d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:48.011699 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b1bb9157-0e73-4254-87e9-1965229f6880-root\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011699 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/09dde8d4-e623-4ee0-84ec-541cf470f1d8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:48.011699 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-wtmp\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011699 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-accelerators-collector-config\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011699 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1bb9157-0e73-4254-87e9-1965229f6880-sys\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011959 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cds5q\" (UniqueName: \"kubernetes.io/projected/09dde8d4-e623-4ee0-84ec-541cf470f1d8-kube-api-access-cds5q\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:48.011959 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011746 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-textfile\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011959 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09dde8d4-e623-4ee0-84ec-541cf470f1d8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:48.011959 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1bb9157-0e73-4254-87e9-1965229f6880-sys\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011959 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011959 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-tls\") pod \"node-exporter-wjnsp\" (UID: 
\"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.011959 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.011900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1bb9157-0e73-4254-87e9-1965229f6880-metrics-client-ca\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.012303 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:48.012032 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 23:56:48.012303 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:48.012100 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-tls podName:b1bb9157-0e73-4254-87e9-1965229f6880 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:48.512080945 +0000 UTC m=+192.665694488 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-tls") pod "node-exporter-wjnsp" (UID: "b1bb9157-0e73-4254-87e9-1965229f6880") : secret "node-exporter-tls" not found Apr 24 23:56:48.012303 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.012033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-textfile\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.012303 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.012220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-accelerators-collector-config\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.012489 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:48.012317 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 23:56:48.012489 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:48.012377 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09dde8d4-e623-4ee0-84ec-541cf470f1d8-openshift-state-metrics-tls podName:09dde8d4-e623-4ee0-84ec-541cf470f1d8 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:48.512361088 +0000 UTC m=+192.665974633 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/09dde8d4-e623-4ee0-84ec-541cf470f1d8-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-sdw6h" (UID: "09dde8d4-e623-4ee0-84ec-541cf470f1d8") : secret "openshift-state-metrics-tls" not found Apr 24 23:56:48.012695 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.012676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1bb9157-0e73-4254-87e9-1965229f6880-metrics-client-ca\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.012757 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.012695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09dde8d4-e623-4ee0-84ec-541cf470f1d8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:48.014554 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.014527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.014689 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.014672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09dde8d4-e623-4ee0-84ec-541cf470f1d8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: 
\"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:48.025920 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.025894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cds5q\" (UniqueName: \"kubernetes.io/projected/09dde8d4-e623-4ee0-84ec-541cf470f1d8-kube-api-access-cds5q\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:48.026047 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.026030 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4pb8\" (UniqueName: \"kubernetes.io/projected/b1bb9157-0e73-4254-87e9-1965229f6880-kube-api-access-d4pb8\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.515635 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.515590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-tls\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.515899 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.515703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09dde8d4-e623-4ee0-84ec-541cf470f1d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:48.518060 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.518034 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09dde8d4-e623-4ee0-84ec-541cf470f1d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sdw6h\" (UID: \"09dde8d4-e623-4ee0-84ec-541cf470f1d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:48.518176 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.518061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b1bb9157-0e73-4254-87e9-1965229f6880-node-exporter-tls\") pod \"node-exporter-wjnsp\" (UID: \"b1bb9157-0e73-4254-87e9-1965229f6880\") " pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.679250 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.679190 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" Apr 24 23:56:48.690016 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.689982 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-wjnsp" Apr 24 23:56:48.698235 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:48.698190 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1bb9157_0e73_4254_87e9_1965229f6880.slice/crio-b2d7d93fc1413fa6d27e35788f61a7582eb214047c0054fba5465a622db73ec9 WatchSource:0}: Error finding container b2d7d93fc1413fa6d27e35788f61a7582eb214047c0054fba5465a622db73ec9: Status 404 returned error can't find the container with id b2d7d93fc1413fa6d27e35788f61a7582eb214047c0054fba5465a622db73ec9 Apr 24 23:56:48.721434 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.721405 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:56:48.801321 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.801239 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h"] Apr 24 23:56:48.805792 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:48.805754 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09dde8d4_e623_4ee0_84ec_541cf470f1d8.slice/crio-38b7cb5b6cc2ad9f6d614a5a3c2aa527b677cc4bfe87c1ead890f89efc560127 WatchSource:0}: Error finding container 38b7cb5b6cc2ad9f6d614a5a3c2aa527b677cc4bfe87c1ead890f89efc560127: Status 404 returned error can't find the container with id 38b7cb5b6cc2ad9f6d614a5a3c2aa527b677cc4bfe87c1ead890f89efc560127 Apr 24 23:56:48.821165 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.821135 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:48.825721 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.825703 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.827943 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.827916 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 23:56:48.828067 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.828046 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 23:56:48.828188 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.828169 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 23:56:48.828255 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.828206 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-95brv\"" Apr 24 23:56:48.828337 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.828318 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 23:56:48.828635 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.828617 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 23:56:48.829463 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.829313 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 23:56:48.829463 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.829346 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 23:56:48.829463 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.829361 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 23:56:48.829463 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.829370 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 23:56:48.835102 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.835082 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:48.919591 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.919739 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vjqq\" (UniqueName: \"kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-kube-api-access-5vjqq\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.919739 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.919739 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-volume\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.919739 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919733 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.919922 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.919922 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919795 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.919922 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.919922 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:56:48.919845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.919922 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-out\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.920089 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.920089 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:48.920089 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:48.919960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-web-config\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.003956 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.003923 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wjnsp" event={"ID":"b1bb9157-0e73-4254-87e9-1965229f6880","Type":"ContainerStarted","Data":"b2d7d93fc1413fa6d27e35788f61a7582eb214047c0054fba5465a622db73ec9"} Apr 24 23:56:49.005418 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.005388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" event={"ID":"42b467d5-fece-454f-a00a-550cc3a4e698","Type":"ContainerStarted","Data":"927d9bc6fae31e127fda46244688ee5b2deed76f1655695d06d180905fb50e7d"} Apr 24 23:56:49.005597 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.005561 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:56:49.006289 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.006261 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84f9bc9448-7qqkz" Apr 24 23:56:49.007082 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.007063 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" event={"ID":"09dde8d4-e623-4ee0-84ec-541cf470f1d8","Type":"ContainerStarted","Data":"91f50f5fc90bafe80e3b3262af3b6bb7e8b3ab7aa2f0bc8b109bb28ccd1f9298"} Apr 24 23:56:49.007082 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.007086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" 
event={"ID":"09dde8d4-e623-4ee0-84ec-541cf470f1d8","Type":"ContainerStarted","Data":"b9e5f61e83d22278279724dcc4a76c0d028779342bbfa07968d2e2f62cc6f0ba"} Apr 24 23:56:49.007248 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.007096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" event={"ID":"09dde8d4-e623-4ee0-84ec-541cf470f1d8","Type":"ContainerStarted","Data":"38b7cb5b6cc2ad9f6d614a5a3c2aa527b677cc4bfe87c1ead890f89efc560127"} Apr 24 23:56:49.021096 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021226 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vjqq\" (UniqueName: \"kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-kube-api-access-5vjqq\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021226 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021339 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-volume\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021339 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021339 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021469 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021469 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021469 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021425 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021469 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-out\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021653 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021501 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021653 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021653 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-web-config\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021818 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.021656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.021818 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:49.021762 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-trusted-ca-bundle podName:cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:49.521742707 +0000 UTC m=+193.675356256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519") : configmap references non-existent config key: ca-bundle.crt Apr 24 23:56:49.022801 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.022717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.022970 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:49.022907 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 24 23:56:49.022970 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:56:49.022968 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-main-tls podName:cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519 
nodeName:}" failed. No retries permitted until 2026-04-24 23:56:49.522951213 +0000 UTC m=+193.676564755 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519") : secret "alertmanager-main-tls" not found Apr 24 23:56:49.024735 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.024714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.025031 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.024826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.025668 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.025634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.025668 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.025655 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.025815 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.025682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-out\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.026019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.026000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-volume\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.026414 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.026391 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.026463 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.026431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-web-config\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.033821 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.033772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vjqq\" (UniqueName: 
\"kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-kube-api-access-5vjqq\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.526595 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.526561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.526904 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.526652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.527832 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.527772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.530077 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.530029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:49.750827 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:49.750791 
2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:50.011235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:50.011205 2575 generic.go:358] "Generic (PLEG): container finished" podID="b1bb9157-0e73-4254-87e9-1965229f6880" containerID="719dada1898cc5b277d2235a0a131ba69924257aa848db9f49e82e5405689a19" exitCode=0 Apr 24 23:56:50.011611 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:50.011291 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wjnsp" event={"ID":"b1bb9157-0e73-4254-87e9-1965229f6880","Type":"ContainerDied","Data":"719dada1898cc5b277d2235a0a131ba69924257aa848db9f49e82e5405689a19"} Apr 24 23:56:50.026687 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:50.026652 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:50.031430 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:50.031407 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd1d3ed0_d9e2_4c22_9e18_7ac9ff9e6519.slice/crio-038f1e0e05894e024d6424c36494091ad14973d61d7aec74d095f057f50a590c WatchSource:0}: Error finding container 038f1e0e05894e024d6424c36494091ad14973d61d7aec74d095f057f50a590c: Status 404 returned error can't find the container with id 038f1e0e05894e024d6424c36494091ad14973d61d7aec74d095f057f50a590c Apr 24 23:56:51.015898 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:51.015858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" event={"ID":"09dde8d4-e623-4ee0-84ec-541cf470f1d8","Type":"ContainerStarted","Data":"6e4869445c7b493f87549010fbc90535ca71091facf47402667f81273d8ce6af"} Apr 24 23:56:51.017108 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:51.017080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerStarted","Data":"038f1e0e05894e024d6424c36494091ad14973d61d7aec74d095f057f50a590c"} Apr 24 23:56:51.018919 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:51.018884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wjnsp" event={"ID":"b1bb9157-0e73-4254-87e9-1965229f6880","Type":"ContainerStarted","Data":"7698c326c2fddf8ebf28df19a9ef54201874e94be4ca05cbc2c00d32602d8cef"} Apr 24 23:56:51.019141 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:51.018921 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wjnsp" event={"ID":"b1bb9157-0e73-4254-87e9-1965229f6880","Type":"ContainerStarted","Data":"7b2ce9029d57bbb81890fb71dd870ba1555b62147422383c43a8a7ec5c996c5d"} Apr 24 23:56:51.034806 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:51.034737 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sdw6h" podStartSLOduration=3.021073745 podStartE2EDuration="4.034715998s" podCreationTimestamp="2026-04-24 23:56:47 +0000 UTC" firstStartedPulling="2026-04-24 23:56:48.93203877 +0000 UTC m=+193.085652299" lastFinishedPulling="2026-04-24 23:56:49.945681019 +0000 UTC m=+194.099294552" observedRunningTime="2026-04-24 23:56:51.032201607 +0000 UTC m=+195.185815205" watchObservedRunningTime="2026-04-24 23:56:51.034715998 +0000 UTC m=+195.188329549" Apr 24 23:56:51.051003 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:51.050948 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wjnsp" podStartSLOduration=3.36581259 podStartE2EDuration="4.050932579s" podCreationTimestamp="2026-04-24 23:56:47 +0000 UTC" firstStartedPulling="2026-04-24 23:56:48.700117627 +0000 UTC m=+192.853731155" lastFinishedPulling="2026-04-24 23:56:49.385237613 +0000 UTC 
m=+193.538851144" observedRunningTime="2026-04-24 23:56:51.050364958 +0000 UTC m=+195.203978532" watchObservedRunningTime="2026-04-24 23:56:51.050932579 +0000 UTC m=+195.204546207" Apr 24 23:56:52.022851 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.022814 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerID="b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151" exitCode=0 Apr 24 23:56:52.023289 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.022910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerDied","Data":"b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151"} Apr 24 23:56:52.591976 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.591929 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf"] Apr 24 23:56:52.595226 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.595190 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf" Apr 24 23:56:52.597754 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.597728 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 23:56:52.597917 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.597798 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-cvdcl\"" Apr 24 23:56:52.607939 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.607912 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf"] Apr 24 23:56:52.655254 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.655214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/54ddef74-ad26-4a93-a6a9-689019b22fd7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9jzcf\" (UID: \"54ddef74-ad26-4a93-a6a9-689019b22fd7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf" Apr 24 23:56:52.756350 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.756317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/54ddef74-ad26-4a93-a6a9-689019b22fd7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9jzcf\" (UID: \"54ddef74-ad26-4a93-a6a9-689019b22fd7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf" Apr 24 23:56:52.759665 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.759620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/54ddef74-ad26-4a93-a6a9-689019b22fd7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9jzcf\" (UID: \"54ddef74-ad26-4a93-a6a9-689019b22fd7\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf" Apr 24 23:56:52.907258 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:52.907222 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf" Apr 24 23:56:53.040631 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.040600 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf"] Apr 24 23:56:53.264373 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:53.264304 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54ddef74_ad26_4a93_a6a9_689019b22fd7.slice/crio-f9f523d8d7bff3bffec1da10a58e8de2be4d648c1927e405af97f31368f11c33 WatchSource:0}: Error finding container f9f523d8d7bff3bffec1da10a58e8de2be4d648c1927e405af97f31368f11c33: Status 404 returned error can't find the container with id f9f523d8d7bff3bffec1da10a58e8de2be4d648c1927e405af97f31368f11c33 Apr 24 23:56:53.960384 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.960343 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:53.964334 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.964308 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:53.967045 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.967020 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 23:56:53.967045 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.967043 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 23:56:53.967209 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.967069 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 23:56:53.967209 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.967024 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 23:56:53.967511 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.967483 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 23:56:53.967588 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.967541 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 23:56:53.968394 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.968372 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 23:56:53.968394 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.968379 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4aqfllkh3n0nv\"" Apr 24 23:56:53.968864 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.968669 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 23:56:53.968864 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.968715 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 23:56:53.968864 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.968720 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sjtmx\"" Apr 24 23:56:53.968864 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.968807 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 23:56:53.969113 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.968889 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 23:56:53.970301 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.970280 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 23:56:53.973817 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.973795 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 23:56:53.976503 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:53.976482 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:54.032964 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.032926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerStarted","Data":"f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142"} Apr 24 23:56:54.032964 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.032969 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerStarted","Data":"ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3"} Apr 24 23:56:54.033167 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.032984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerStarted","Data":"a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c"} Apr 24 23:56:54.033167 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.033001 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerStarted","Data":"38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e"} Apr 24 23:56:54.033167 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.033015 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerStarted","Data":"5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c"} Apr 24 23:56:54.037303 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.037264 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf" event={"ID":"54ddef74-ad26-4a93-a6a9-689019b22fd7","Type":"ContainerStarted","Data":"f9f523d8d7bff3bffec1da10a58e8de2be4d648c1927e405af97f31368f11c33"} Apr 24 23:56:54.070838 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.070794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.070853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.070902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.070933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.070959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-tbjt7\" (UniqueName: \"kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-kube-api-access-tbjt7\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071108 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-web-config\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:56:54.071252 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071311 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071893 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071344 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071893 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071893 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config-out\") 
pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071893 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071432 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071893 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.071893 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.071519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172150 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172150 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172151 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172397 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172397 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172196 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172397 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172213 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172397 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbjt7\" (UniqueName: \"kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-kube-api-access-tbjt7\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172397 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172397 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-web-config\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172397 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172366 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172746 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172746 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 24 23:56:54.172746 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172746 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172746 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172746 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config-out\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172746 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172746 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.172746 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.173208 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.172986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.173208 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.173188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.173344 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.173319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.173814 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.173461 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.174759 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.174731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.175657 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.175632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.176256 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.175970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.176256 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.176072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" 
(UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.176256 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.176149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.176256 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.176180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.177273 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.177230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.177367 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.177315 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.178420 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.178375 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config-out\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.179006 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.178668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.179099 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.179059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.179260 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.179239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.179682 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.179663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-web-config\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.180966 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.180940 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tbjt7\" (UniqueName: \"kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-kube-api-access-tbjt7\") pod \"prometheus-k8s-0\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.278487 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.278390 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:54.662498 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:54.662462 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:54.664600 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:56:54.664571 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56404ccf_ab1e_4c6f_bdb4_44a1718d54dc.slice/crio-ce65c9bc4e631af8a8970cf9d1eb5285951e1c8528156d12104ea3bc211c4bd5 WatchSource:0}: Error finding container ce65c9bc4e631af8a8970cf9d1eb5285951e1c8528156d12104ea3bc211c4bd5: Status 404 returned error can't find the container with id ce65c9bc4e631af8a8970cf9d1eb5285951e1c8528156d12104ea3bc211c4bd5 Apr 24 23:56:55.042669 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:55.042630 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerStarted","Data":"311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77"} Apr 24 23:56:55.044042 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:55.044016 2575 generic.go:358] "Generic (PLEG): container finished" podID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerID="0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166" exitCode=0 Apr 24 23:56:55.044165 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:55.044092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerDied","Data":"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166"} Apr 24 23:56:55.044165 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:55.044132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerStarted","Data":"ce65c9bc4e631af8a8970cf9d1eb5285951e1c8528156d12104ea3bc211c4bd5"} Apr 24 23:56:55.045406 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:55.045372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf" event={"ID":"54ddef74-ad26-4a93-a6a9-689019b22fd7","Type":"ContainerStarted","Data":"a266d181244739443b69c00a0b9a19cca5ce9350ed97ecfdb799703afb3b3b95"} Apr 24 23:56:55.045600 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:55.045585 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf" Apr 24 23:56:55.050427 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:55.050407 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf" Apr 24 23:56:55.069431 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:55.069375 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.577183023 podStartE2EDuration="7.06935821s" podCreationTimestamp="2026-04-24 23:56:48 +0000 UTC" firstStartedPulling="2026-04-24 23:56:50.036668733 +0000 UTC m=+194.190282269" lastFinishedPulling="2026-04-24 23:56:54.528843927 +0000 UTC m=+198.682457456" observedRunningTime="2026-04-24 23:56:55.067496247 +0000 UTC m=+199.221109799" watchObservedRunningTime="2026-04-24 23:56:55.06935821 +0000 UTC m=+199.222971768" Apr 24 23:56:55.111066 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:56:55.111016 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9jzcf" podStartSLOduration=1.849226874 podStartE2EDuration="3.111001119s" podCreationTimestamp="2026-04-24 23:56:52 +0000 UTC" firstStartedPulling="2026-04-24 23:56:53.266282084 +0000 UTC m=+197.419895613" lastFinishedPulling="2026-04-24 23:56:54.528056314 +0000 UTC m=+198.681669858" observedRunningTime="2026-04-24 23:56:55.10958253 +0000 UTC m=+199.263196082" watchObservedRunningTime="2026-04-24 23:56:55.111001119 +0000 UTC m=+199.264614670" Apr 24 23:56:58.057191 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:58.057151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerStarted","Data":"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"} Apr 24 23:56:58.057191 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:56:58.057197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerStarted","Data":"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"} Apr 24 23:57:00.065878 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:00.065837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerStarted","Data":"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"} Apr 24 23:57:00.065878 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:00.065872 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerStarted","Data":"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"} Apr 24 23:57:00.065878 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:57:00.065882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerStarted","Data":"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"} Apr 24 23:57:00.066306 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:00.065891 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerStarted","Data":"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"} Apr 24 23:57:00.092574 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:00.092519 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.893126432 podStartE2EDuration="7.0925031s" podCreationTimestamp="2026-04-24 23:56:53 +0000 UTC" firstStartedPulling="2026-04-24 23:56:55.045227664 +0000 UTC m=+199.198841196" lastFinishedPulling="2026-04-24 23:56:59.244604335 +0000 UTC m=+203.398217864" observedRunningTime="2026-04-24 23:57:00.090277427 +0000 UTC m=+204.243890978" watchObservedRunningTime="2026-04-24 23:57:00.0925031 +0000 UTC m=+204.246116651" Apr 24 23:57:00.558662 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:00.558616 2575 patch_prober.go:28] interesting pod/image-registry-588b569ff5-gxtpm container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 23:57:00.558830 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:00.558680 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" podUID="f0f5a613-6fd5-430e-ad23-4b5e9282d16b" containerName="registry" probeResult="failure" output="HTTP 
probe failed with statuscode: 503" Apr 24 23:57:01.989099 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:01.989069 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-588b569ff5-gxtpm" Apr 24 23:57:04.279049 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:04.279000 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:34.514111 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:34.514081 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rpxzt_60d87af7-f253-4e76-9605-d6707237c596/dns-node-resolver/0.log" Apr 24 23:57:39.177539 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:39.177508 2575 generic.go:358] "Generic (PLEG): container finished" podID="0ff7fe59-d120-4f53-9df4-e0b4aa229a5e" containerID="d8e6c27a62263942c10fa3f54660277a2809301824eb4b89f78745a4c795487f" exitCode=0 Apr 24 23:57:39.177962 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:39.177585 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" event={"ID":"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e","Type":"ContainerDied","Data":"d8e6c27a62263942c10fa3f54660277a2809301824eb4b89f78745a4c795487f"} Apr 24 23:57:39.178020 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:39.177960 2575 scope.go:117] "RemoveContainer" containerID="d8e6c27a62263942c10fa3f54660277a2809301824eb4b89f78745a4c795487f" Apr 24 23:57:39.178887 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:39.178864 2575 generic.go:358] "Generic (PLEG): container finished" podID="f79704e7-d789-4f4c-8f4b-b4183bea75dd" containerID="d6674efcd22995292800e76fd8531614c153c102b15106a2fe85038ebb927da1" exitCode=0 Apr 24 23:57:39.178965 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:39.178907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" event={"ID":"f79704e7-d789-4f4c-8f4b-b4183bea75dd","Type":"ContainerDied","Data":"d6674efcd22995292800e76fd8531614c153c102b15106a2fe85038ebb927da1"} Apr 24 23:57:39.179197 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:39.179181 2575 scope.go:117] "RemoveContainer" containerID="d6674efcd22995292800e76fd8531614c153c102b15106a2fe85038ebb927da1" Apr 24 23:57:40.182908 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:40.182874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tgx7m" event={"ID":"f79704e7-d789-4f4c-8f4b-b4183bea75dd","Type":"ContainerStarted","Data":"b214862f7071c3ffc609826a5a1f240015e63edeb62e5b61a747aec3140887eb"} Apr 24 23:57:40.184422 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:40.184396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xqmlq" event={"ID":"0ff7fe59-d120-4f53-9df4-e0b4aa229a5e","Type":"ContainerStarted","Data":"faca5d16c3b04657c8ea1d9e083912fe564d67759b5a3fc164ecdb861e883d1a"} Apr 24 23:57:47.123104 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:47.123067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:57:47.125746 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:47.125715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/148a2391-987d-4318-b295-01018903ff94-metrics-certs\") pod \"network-metrics-daemon-xvbxz\" (UID: \"148a2391-987d-4318-b295-01018903ff94\") " 
pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:57:47.213790 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:47.213731 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qtxb2\"" Apr 24 23:57:47.221550 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:47.221513 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xvbxz" Apr 24 23:57:47.374734 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:47.374658 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xvbxz"] Apr 24 23:57:47.377769 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:57:47.377725 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod148a2391_987d_4318_b295_01018903ff94.slice/crio-65fd1ec381df25f2d40223c716c9dcfc21237c8ec373fbb2a11ebfe1983596de WatchSource:0}: Error finding container 65fd1ec381df25f2d40223c716c9dcfc21237c8ec373fbb2a11ebfe1983596de: Status 404 returned error can't find the container with id 65fd1ec381df25f2d40223c716c9dcfc21237c8ec373fbb2a11ebfe1983596de Apr 24 23:57:48.210847 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:48.210811 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xvbxz" event={"ID":"148a2391-987d-4318-b295-01018903ff94","Type":"ContainerStarted","Data":"65fd1ec381df25f2d40223c716c9dcfc21237c8ec373fbb2a11ebfe1983596de"} Apr 24 23:57:49.214729 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:49.214692 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xvbxz" event={"ID":"148a2391-987d-4318-b295-01018903ff94","Type":"ContainerStarted","Data":"d58a68613b69a0c3f26eab93a1e91bbcc25319e85846803393d6d70ae9010964"} Apr 24 23:57:49.214729 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:49.214728 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xvbxz" event={"ID":"148a2391-987d-4318-b295-01018903ff94","Type":"ContainerStarted","Data":"2a26739271d6ea09f56eb399d1baf98ef8a50a47df4b147c4299eda59f81d487"} Apr 24 23:57:49.233524 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:49.233475 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xvbxz" podStartSLOduration=252.188475591 podStartE2EDuration="4m13.233461647s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:57:47.379913513 +0000 UTC m=+251.533527083" lastFinishedPulling="2026-04-24 23:57:48.424899606 +0000 UTC m=+252.578513139" observedRunningTime="2026-04-24 23:57:49.230870517 +0000 UTC m=+253.384484068" watchObservedRunningTime="2026-04-24 23:57:49.233461647 +0000 UTC m=+253.387075197" Apr 24 23:57:54.279381 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:54.279342 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:54.299595 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:54.299563 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:55.247998 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:57:55.247974 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:07.987053 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:07.987016 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:58:07.987479 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:07.987435 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="alertmanager" 
containerID="cri-o://5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c" gracePeriod=120 Apr 24 23:58:07.987583 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:07.987513 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy-web" containerID="cri-o://a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c" gracePeriod=120 Apr 24 23:58:07.987645 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:07.987601 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="config-reloader" containerID="cri-o://38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e" gracePeriod=120 Apr 24 23:58:07.987701 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:07.987540 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy" containerID="cri-o://ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3" gracePeriod=120 Apr 24 23:58:07.987701 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:07.987642 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="prom-label-proxy" containerID="cri-o://311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77" gracePeriod=120 Apr 24 23:58:07.987835 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:07.987535 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy-metric" 
containerID="cri-o://f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142" gracePeriod=120 Apr 24 23:58:08.273793 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:08.273712 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerID="311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77" exitCode=0 Apr 24 23:58:08.273793 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:08.273734 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerID="f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142" exitCode=0 Apr 24 23:58:08.273793 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:08.273740 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerID="ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3" exitCode=0 Apr 24 23:58:08.273793 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:08.273745 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerID="38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e" exitCode=0 Apr 24 23:58:08.273793 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:08.273750 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerID="5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c" exitCode=0 Apr 24 23:58:08.274010 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:08.273797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerDied","Data":"311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77"} Apr 24 23:58:08.274010 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:08.273833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerDied","Data":"f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142"} Apr 24 23:58:08.274010 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:08.273843 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerDied","Data":"ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3"} Apr 24 23:58:08.274010 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:08.273852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerDied","Data":"38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e"} Apr 24 23:58:08.274010 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:08.273860 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerDied","Data":"5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c"} Apr 24 23:58:09.233079 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.233058 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.280027 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.279998 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerID="a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c" exitCode=0 Apr 24 23:58:09.280146 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.280041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerDied","Data":"a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c"} Apr 24 23:58:09.280146 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.280109 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519","Type":"ContainerDied","Data":"038f1e0e05894e024d6424c36494091ad14973d61d7aec74d095f057f50a590c"} Apr 24 23:58:09.280146 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.280126 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.280281 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.280128 2575 scope.go:117] "RemoveContainer" containerID="311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77" Apr 24 23:58:09.288863 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.288748 2575 scope.go:117] "RemoveContainer" containerID="f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142" Apr 24 23:58:09.296055 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.296040 2575 scope.go:117] "RemoveContainer" containerID="ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3" Apr 24 23:58:09.304158 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.304140 2575 scope.go:117] "RemoveContainer" containerID="a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c" Apr 24 23:58:09.304855 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.304833 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-main-db\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.304952 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.304872 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-metric\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.304952 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.304912 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-trusted-ca-bundle\") pod 
\"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.305125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.304961 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-web-config\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.305125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.304988 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-tls-assets\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.305125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305032 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vjqq\" (UniqueName: \"kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-kube-api-access-5vjqq\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.305269 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305156 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:58:09.305390 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305363 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-cluster-tls-config\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.305455 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305426 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-out\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.305455 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305424 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:58:09.305558 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305464 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-metrics-client-ca\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.305558 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305492 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-web\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.305558 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305516 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-main-tls\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.305714 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305562 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-volume\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.305714 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305595 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy\") pod \"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\" (UID: 
\"cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519\") " Apr 24 23:58:09.306510 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305930 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-main-db\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.306510 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.305956 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.307551 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.307492 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:58:09.308141 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.308114 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:58:09.308432 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.308391 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-out" (OuterVolumeSpecName: "config-out") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:58:09.308432 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.308418 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-kube-api-access-5vjqq" (OuterVolumeSpecName: "kube-api-access-5vjqq") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "kube-api-access-5vjqq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:58:09.309044 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.309008 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:58:09.309574 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.309541 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:58:09.309819 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.309773 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-volume" (OuterVolumeSpecName: "config-volume") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:58:09.309940 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.309922 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:58:09.310448 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.310431 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:58:09.312984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.312947 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:58:09.319122 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.319072 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-web-config" (OuterVolumeSpecName: "web-config") pod "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" (UID: "cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:58:09.326464 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.326446 2575 scope.go:117] "RemoveContainer" containerID="38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e" Apr 24 23:58:09.332509 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.332495 2575 scope.go:117] "RemoveContainer" containerID="5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c" Apr 24 23:58:09.338284 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.338270 2575 scope.go:117] "RemoveContainer" containerID="b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151" Apr 24 23:58:09.343924 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.343908 2575 scope.go:117] "RemoveContainer" containerID="311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77" Apr 24 23:58:09.344157 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:09.344138 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77\": container with ID starting with 311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77 not found: ID does not exist" containerID="311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77" Apr 24 23:58:09.344214 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.344164 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77"} err="failed to get container status \"311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77\": rpc error: code = NotFound desc = could not find container \"311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77\": container with ID starting with 311aad7a81e520693ec05eb14aee0665d72e843dd6999711137bd15908f32a77 not found: ID does not exist" Apr 24 23:58:09.344214 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.344193 2575 scope.go:117] "RemoveContainer" containerID="f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142" Apr 24 23:58:09.344415 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:09.344400 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142\": container with ID starting with f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142 not found: ID does not exist" containerID="f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142" Apr 24 23:58:09.344457 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.344420 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142"} err="failed to get container status \"f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142\": rpc error: code = NotFound desc = could not find container \"f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142\": container with ID starting with f6c0ac6a116aa5cfc4eb05ab790039fb4c1476e4dce4848f04686da87e017142 not found: ID does not exist" Apr 24 23:58:09.344457 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.344436 2575 scope.go:117] "RemoveContainer" containerID="ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3" Apr 24 23:58:09.344629 ip-10-0-128-234 
kubenswrapper[2575]: E0424 23:58:09.344615 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3\": container with ID starting with ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3 not found: ID does not exist" containerID="ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3" Apr 24 23:58:09.344669 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.344633 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3"} err="failed to get container status \"ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3\": rpc error: code = NotFound desc = could not find container \"ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3\": container with ID starting with ea0a0029e1182d943670719e43f47cc5da1023e5d6b4c30642eecb9dc0fa98e3 not found: ID does not exist" Apr 24 23:58:09.344669 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.344647 2575 scope.go:117] "RemoveContainer" containerID="a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c" Apr 24 23:58:09.344872 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:09.344854 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c\": container with ID starting with a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c not found: ID does not exist" containerID="a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c" Apr 24 23:58:09.344945 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.344880 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c"} 
err="failed to get container status \"a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c\": rpc error: code = NotFound desc = could not find container \"a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c\": container with ID starting with a2340aba72e2847448482485bdaf1bd7026b51fdb019b9ae23383325844d242c not found: ID does not exist" Apr 24 23:58:09.344945 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.344902 2575 scope.go:117] "RemoveContainer" containerID="38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e" Apr 24 23:58:09.345123 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:09.345108 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e\": container with ID starting with 38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e not found: ID does not exist" containerID="38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e" Apr 24 23:58:09.345160 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.345126 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e"} err="failed to get container status \"38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e\": rpc error: code = NotFound desc = could not find container \"38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e\": container with ID starting with 38ab5d9c6cc332ef9d77d8d689012eb8ede2915f60e4b8cd9f8d5c7fdf92173e not found: ID does not exist" Apr 24 23:58:09.345160 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.345139 2575 scope.go:117] "RemoveContainer" containerID="5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c" Apr 24 23:58:09.345368 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:09.345349 2575 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c\": container with ID starting with 5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c not found: ID does not exist" containerID="5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c" Apr 24 23:58:09.345415 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.345380 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c"} err="failed to get container status \"5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c\": rpc error: code = NotFound desc = could not find container \"5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c\": container with ID starting with 5cf2dc3e9f5d46051d15a9a5691e5511ab9e5e16c736fab9cc0dcbd434e94a4c not found: ID does not exist" Apr 24 23:58:09.345415 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.345395 2575 scope.go:117] "RemoveContainer" containerID="b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151" Apr 24 23:58:09.345587 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:09.345571 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151\": container with ID starting with b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151 not found: ID does not exist" containerID="b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151" Apr 24 23:58:09.345624 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.345589 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151"} err="failed to get container status \"b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151\": rpc 
error: code = NotFound desc = could not find container \"b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151\": container with ID starting with b05c9205cc5cd59acbf36866d0116d00146e697b6a63485884ecc05d136a4151 not found: ID does not exist" Apr 24 23:58:09.407144 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407121 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-volume\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.407144 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407143 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.407235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407153 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.407235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407163 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-web-config\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.407235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407172 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-tls-assets\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.407235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407180 2575 reconciler_common.go:299] 
"Volume detached for volume \"kube-api-access-5vjqq\" (UniqueName: \"kubernetes.io/projected/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-kube-api-access-5vjqq\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.407235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407188 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-cluster-tls-config\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.407235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407197 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-config-out\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.407235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407205 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-metrics-client-ca\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.407235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407213 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.407235 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.407221 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519-secret-alertmanager-main-tls\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:09.604149 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.604119 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:58:09.607989 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.607968 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:58:09.638983 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.638945 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:58:09.639280 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639263 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy-metric" Apr 24 23:58:09.639360 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639284 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy-metric" Apr 24 23:58:09.639360 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639303 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="alertmanager" Apr 24 23:58:09.639360 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639314 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="alertmanager" Apr 24 23:58:09.639360 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639324 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="prom-label-proxy" Apr 24 23:58:09.639360 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639333 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="prom-label-proxy" Apr 24 23:58:09.639360 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639349 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" 
containerName="init-config-reloader" Apr 24 23:58:09.639360 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639357 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="init-config-reloader" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639370 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy-web" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639378 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy-web" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639424 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="config-reloader" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639433 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="config-reloader" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639453 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639461 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639539 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy-web" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639553 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639565 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="alertmanager" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639575 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="kube-rbac-proxy-metric" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639585 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="prom-label-proxy" Apr 24 23:58:09.639675 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.639594 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" containerName="config-reloader" Apr 24 23:58:09.646454 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.646432 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.649024 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.649005 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 23:58:09.649193 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.649176 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 23:58:09.649291 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.649218 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-95brv\"" Apr 24 23:58:09.649291 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.649264 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 23:58:09.649391 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.649297 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 23:58:09.649525 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.649512 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 23:58:09.649595 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.649542 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 23:58:09.649595 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.649511 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 23:58:09.649595 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.649517 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 23:58:09.654895 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.654877 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:58:09.655179 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.655162 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 23:58:09.708880 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.708857 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.708957 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.708882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b90943c7-0bb9-42ba-a169-6dc56578b450-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.708957 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.708902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b90943c7-0bb9-42ba-a169-6dc56578b450-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.708957 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.708919 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.708957 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.708949 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p42bd\" (UniqueName: \"kubernetes.io/projected/b90943c7-0bb9-42ba-a169-6dc56578b450-kube-api-access-p42bd\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.709168 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.709034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b90943c7-0bb9-42ba-a169-6dc56578b450-config-out\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.709168 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.709093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b90943c7-0bb9-42ba-a169-6dc56578b450-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.709168 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.709126 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.709272 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.709178 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b90943c7-0bb9-42ba-a169-6dc56578b450-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.709272 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.709197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-config-volume\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.709272 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.709215 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.709272 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.709243 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.709399 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.709276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-web-config\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.810317 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.810297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.810398 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.810339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b90943c7-0bb9-42ba-a169-6dc56578b450-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.810398 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.810358 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-config-volume\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.810482 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.810458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.810532 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.810501 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.810641 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.810535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-web-config\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.810641 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.810561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.810641 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.810586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b90943c7-0bb9-42ba-a169-6dc56578b450-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.811351 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.811330 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b90943c7-0bb9-42ba-a169-6dc56578b450-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.811509 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:58:09.811493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b90943c7-0bb9-42ba-a169-6dc56578b450-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.811644 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.811626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.811751 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.811736 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p42bd\" (UniqueName: \"kubernetes.io/projected/b90943c7-0bb9-42ba-a169-6dc56578b450-kube-api-access-p42bd\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.811914 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.811898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b90943c7-0bb9-42ba-a169-6dc56578b450-config-out\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.812045 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.812029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b90943c7-0bb9-42ba-a169-6dc56578b450-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.813002 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.812981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b90943c7-0bb9-42ba-a169-6dc56578b450-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.813354 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.813320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-config-volume\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.813354 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.813320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.813493 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.813379 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b90943c7-0bb9-42ba-a169-6dc56578b450-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.813493 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.813403 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b90943c7-0bb9-42ba-a169-6dc56578b450-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.813493 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.813417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.813649 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.813561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.813649 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.813575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.813902 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.813886 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-web-config\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.815089 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.815070 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/b90943c7-0bb9-42ba-a169-6dc56578b450-config-out\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.815380 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.815361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b90943c7-0bb9-42ba-a169-6dc56578b450-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.822466 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.822449 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p42bd\" (UniqueName: \"kubernetes.io/projected/b90943c7-0bb9-42ba-a169-6dc56578b450-kube-api-access-p42bd\") pod \"alertmanager-main-0\" (UID: \"b90943c7-0bb9-42ba-a169-6dc56578b450\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:09.957658 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:09.957636 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:58:10.080046 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:10.080021 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:58:10.082847 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:58:10.082815 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90943c7_0bb9_42ba_a169_6dc56578b450.slice/crio-1b0d977e2207ab9e7a0d899596105a339a1d26e7460347462fd9627f450f3be1 WatchSource:0}: Error finding container 1b0d977e2207ab9e7a0d899596105a339a1d26e7460347462fd9627f450f3be1: Status 404 returned error can't find the container with id 1b0d977e2207ab9e7a0d899596105a339a1d26e7460347462fd9627f450f3be1 Apr 24 23:58:10.285224 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:10.285156 2575 generic.go:358] "Generic (PLEG): container finished" podID="b90943c7-0bb9-42ba-a169-6dc56578b450" containerID="174668940c0847e2a48f5172ea6c00392bca5c43fd77f7bd84cbf4f397fbfc75" exitCode=0 Apr 24 23:58:10.285535 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:10.285225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b90943c7-0bb9-42ba-a169-6dc56578b450","Type":"ContainerDied","Data":"174668940c0847e2a48f5172ea6c00392bca5c43fd77f7bd84cbf4f397fbfc75"} Apr 24 23:58:10.285535 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:10.285248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b90943c7-0bb9-42ba-a169-6dc56578b450","Type":"ContainerStarted","Data":"1b0d977e2207ab9e7a0d899596105a339a1d26e7460347462fd9627f450f3be1"} Apr 24 23:58:10.414317 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:10.414293 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519" 
path="/var/lib/kubelet/pods/cd1d3ed0-d9e2-4c22-9e18-7ac9ff9e6519/volumes" Apr 24 23:58:11.291865 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:11.291824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b90943c7-0bb9-42ba-a169-6dc56578b450","Type":"ContainerStarted","Data":"a6454be98b5edeb2fc804344475dddb50dc1fc30e335ebc6e405b1da1cd8e094"} Apr 24 23:58:11.291865 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:11.291869 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b90943c7-0bb9-42ba-a169-6dc56578b450","Type":"ContainerStarted","Data":"62403bbac5abd43070656c90cd9d425f8725ab657b8e328b32ab741bd9e0ca65"} Apr 24 23:58:11.292287 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:11.291884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b90943c7-0bb9-42ba-a169-6dc56578b450","Type":"ContainerStarted","Data":"bf6a462331130985ca74b02fde2e62a40e4c437fc3abf4b88a131f0e90cc4ce5"} Apr 24 23:58:11.292287 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:11.291896 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b90943c7-0bb9-42ba-a169-6dc56578b450","Type":"ContainerStarted","Data":"0558fedb7dc62dff3ca548cc2f524f6defc46a2e95ddbb92ba6e3efbd5751ced"} Apr 24 23:58:11.292287 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:11.291909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b90943c7-0bb9-42ba-a169-6dc56578b450","Type":"ContainerStarted","Data":"64a949572e22e285dd07d19b1416e49b55dc2ec8da5018d3253a090ac06d72e9"} Apr 24 23:58:11.292287 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:11.291922 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b90943c7-0bb9-42ba-a169-6dc56578b450","Type":"ContainerStarted","Data":"864b0832d35440a10a7120ab1ba132456a73e33e3a06bfa2cff0680fc7673657"} Apr 24 23:58:11.320125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:11.320050 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.319999439 podStartE2EDuration="2.319999439s" podCreationTimestamp="2026-04-24 23:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:58:11.317570522 +0000 UTC m=+275.471184098" watchObservedRunningTime="2026-04-24 23:58:11.319999439 +0000 UTC m=+275.473612992" Apr 24 23:58:12.301703 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.299581 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:58:12.304707 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.304552 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="prometheus" containerID="cri-o://7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e" gracePeriod=600 Apr 24 23:58:12.304707 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.304587 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="thanos-sidecar" containerID="cri-o://0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee" gracePeriod=600 Apr 24 23:58:12.304707 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.304614 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy" 
containerID="cri-o://b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545" gracePeriod=600 Apr 24 23:58:12.304707 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.304603 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy-web" containerID="cri-o://2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f" gracePeriod=600 Apr 24 23:58:12.304707 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.304560 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy-thanos" containerID="cri-o://3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7" gracePeriod=600 Apr 24 23:58:12.304707 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.304603 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="config-reloader" containerID="cri-o://9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb" gracePeriod=600 Apr 24 23:58:12.544750 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.544727 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:12.636245 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636166 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636245 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636208 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-web-config\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636245 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636225 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-metrics-client-certs\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636500 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636246 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-kubelet-serving-ca-bundle\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636500 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636275 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-tls-assets\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: 
\"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636500 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636299 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636500 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636344 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-tls\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636500 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636380 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-db\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636749 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636727 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:58:12.636950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636835 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config-out\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636887 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-rulefiles-0\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636916 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-metrics-client-ca\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.636950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636946 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-kube-rbac-proxy\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.637174 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.636993 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-trusted-ca-bundle\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: 
\"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.637174 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.637017 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-grpc-tls\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.637174 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.637063 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbjt7\" (UniqueName: \"kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-kube-api-access-tbjt7\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.637174 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.637112 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-serving-certs-ca-bundle\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.637174 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.637144 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-thanos-prometheus-http-client-file\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 23:58:12.637427 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.637222 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config\") pod \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\" (UID: \"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc\") " Apr 24 
23:58:12.637427 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.637384 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:58:12.637537 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.637522 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:12.637585 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.637541 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-db\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 24 23:58:12.637854 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.637829 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:12.639092 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.639063 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:12.639865 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.639175 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:12.639865 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.639197 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:12.640040 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.640011 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:58:12.640151 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.640104 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:12.640218 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.640153 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:12.640282 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.640217 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:12.640282 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.640235 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config-out" (OuterVolumeSpecName: "config-out") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:58:12.640404 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.640297 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:12.640639 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.640614 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:12.640859 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.640833 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config" (OuterVolumeSpecName: "config") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:12.641494 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.641475 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-kube-api-access-tbjt7" (OuterVolumeSpecName: "kube-api-access-tbjt7") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "kube-api-access-tbjt7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:58:12.641758 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.641742 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:12.642063 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.641903 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:12.649831 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.649764 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-web-config" (OuterVolumeSpecName: "web-config") pod "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" (UID: "56404ccf-ab1e-4c6f-bdb4-44a1718d54dc"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:12.738950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.738912 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-tls\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.738950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.738942 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config-out\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.738950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.738954 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.738950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.738963 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-metrics-client-ca\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.738974 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-kube-rbac-proxy\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.738985 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-prometheus-trusted-ca-bundle\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.738995 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-grpc-tls\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.739003 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tbjt7\" (UniqueName: \"kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-kube-api-access-tbjt7\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.739012 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.739022 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-thanos-prometheus-http-client-file\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.739031 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-config\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.739040 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.739049 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-web-config\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.739058 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-metrics-client-certs\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.739067 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-tls-assets\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:12.739207 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:12.739075 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\""
Apr 24 23:58:13.301986 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.301956 2575 generic.go:358] "Generic (PLEG): container finished" podID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerID="3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7" exitCode=0
Apr 24 23:58:13.301986 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.301979 2575 generic.go:358] "Generic (PLEG): container finished" podID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerID="b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545" exitCode=0
Apr 24 23:58:13.301986 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.301986 2575 generic.go:358] "Generic (PLEG): container finished" podID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerID="2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f" exitCode=0
Apr 24 23:58:13.301986 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.301992 2575 generic.go:358] "Generic (PLEG): container finished" podID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerID="0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee" exitCode=0
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.301997 2575 generic.go:358] "Generic (PLEG): container finished" podID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerID="9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb" exitCode=0
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.302003 2575 generic.go:358] "Generic (PLEG): container finished" podID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerID="7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e" exitCode=0
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.302038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerDied","Data":"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"}
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.302058 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.302080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerDied","Data":"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"}
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.302092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerDied","Data":"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"}
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.302101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerDied","Data":"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"}
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.302111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerDied","Data":"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"}
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.302120 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerDied","Data":"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"}
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.302129 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56404ccf-ab1e-4c6f-bdb4-44a1718d54dc","Type":"ContainerDied","Data":"ce65c9bc4e631af8a8970cf9d1eb5285951e1c8528156d12104ea3bc211c4bd5"}
Apr 24 23:58:13.302483 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.302145 2575 scope.go:117] "RemoveContainer" containerID="3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"
Apr 24 23:58:13.310441 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.310311 2575 scope.go:117] "RemoveContainer" containerID="b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"
Apr 24 23:58:13.316843 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.316822 2575 scope.go:117] "RemoveContainer" containerID="2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"
Apr 24 23:58:13.325034 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.325016 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 23:58:13.328410 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.328390 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 23:58:13.332472 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.332457 2575 scope.go:117] "RemoveContainer" containerID="0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"
Apr 24 23:58:13.340547 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.340524 2575 scope.go:117] "RemoveContainer" containerID="9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"
Apr 24 23:58:13.346950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.346933 2575 scope.go:117] "RemoveContainer" containerID="7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"
Apr 24 23:58:13.353496 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353475 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 23:58:13.353676 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353661 2575 scope.go:117] "RemoveContainer" containerID="0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166"
Apr 24 23:58:13.353820 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353762 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy-thanos"
Apr 24 23:58:13.353820 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353794 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy-thanos"
Apr 24 23:58:13.353820 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353817 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="config-reloader"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353826 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="config-reloader"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353835 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353843 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353857 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="init-config-reloader"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353866 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="init-config-reloader"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353875 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy-web"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353883 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy-web"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353898 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="prometheus"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353905 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="prometheus"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353926 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="thanos-sidecar"
Apr 24 23:58:13.353984 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.353935 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="thanos-sidecar"
Apr 24 23:58:13.354403 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.354007 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="config-reloader"
Apr 24 23:58:13.354403 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.354019 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="prometheus"
Apr 24 23:58:13.354403 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.354032 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy"
Apr 24 23:58:13.354403 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.354047 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="thanos-sidecar"
Apr 24 23:58:13.354403 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.354057 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy-web"
Apr 24 23:58:13.354403 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.354063 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" containerName="kube-rbac-proxy-thanos"
Apr 24 23:58:13.359332 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.359315 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:13.360021 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.360006 2575 scope.go:117] "RemoveContainer" containerID="3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"
Apr 24 23:58:13.360292 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:13.360271 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": container with ID starting with 3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7 not found: ID does not exist" containerID="3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"
Apr 24 23:58:13.360351 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.360299 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"} err="failed to get container status \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": rpc error: code = NotFound desc = could not find container \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": container with ID starting with 3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7 not found: ID does not exist"
Apr 24 23:58:13.360351 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.360315 2575 scope.go:117] "RemoveContainer" containerID="b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"
Apr 24 23:58:13.360533 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:13.360516 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": container with ID starting with b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545 not found: ID does not exist" containerID="b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"
Apr 24 23:58:13.360573 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.360538 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"} err="failed to get container status \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": rpc error: code = NotFound desc = could not find container \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": container with ID starting with b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545 not found: ID does not exist"
Apr 24 23:58:13.360573 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.360551 2575 scope.go:117] "RemoveContainer" containerID="2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"
Apr 24 23:58:13.360757 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:13.360743 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": container with ID starting with 2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f not found: ID does not exist" containerID="2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"
Apr 24 23:58:13.360881 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.360759 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"} err="failed to get container status \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": rpc error: code = NotFound desc = could not find container \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": container with ID starting with 2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f not found: ID does not exist"
Apr 24 23:58:13.360881 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.360770 2575 scope.go:117] "RemoveContainer" containerID="0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"
Apr 24 23:58:13.361017 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:13.360997 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": container with ID starting with 0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee not found: ID does not exist" containerID="0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"
Apr 24 23:58:13.361050 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.361023 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"} err="failed to get container status \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": rpc error: code = NotFound desc = could not find container \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": container with ID starting with 0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee not found: ID does not exist"
Apr 24 23:58:13.361050 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.361038 2575 scope.go:117] "RemoveContainer" containerID="9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"
Apr 24 23:58:13.361281 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:13.361264 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": container with ID starting with 9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb not found: ID does not exist" containerID="9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"
Apr 24 23:58:13.361319 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.361285 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"} err="failed to get container status \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": rpc error: code = NotFound desc = could not find container \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": container with ID starting with 9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb not found: ID does not exist"
Apr 24 23:58:13.361319 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.361300 2575 scope.go:117] "RemoveContainer" containerID="7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"
Apr 24 23:58:13.361530 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:13.361516 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": container with ID starting with 7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e not found: ID does not exist" containerID="7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"
Apr 24 23:58:13.361565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.361533 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"} err="failed to get container status \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": rpc error: code = NotFound desc = could not find container \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": container with ID starting with 7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e not found: ID does not exist"
Apr 24 23:58:13.361565 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.361545 2575 scope.go:117] "RemoveContainer" containerID="0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166"
Apr 24 23:58:13.361760 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:13.361745 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": container with ID starting with 0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166 not found: ID does not exist" containerID="0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166"
Apr 24 23:58:13.361822 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.361764 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166"} err="failed to get container status \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": rpc error: code = NotFound desc = could not find container \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": container with ID starting with 0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166 not found: ID does not exist"
Apr 24 23:58:13.361822 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.361791 2575 scope.go:117] "RemoveContainer" containerID="3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"
Apr 24 23:58:13.362050 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362026 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"} err="failed to get container status \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": rpc error: code = NotFound desc = could not find container \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": container with ID starting with 3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7 not found: ID does not exist"
Apr 24 23:58:13.362050 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362048 2575 scope.go:117] "RemoveContainer" containerID="b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"
Apr 24 23:58:13.362199 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362155 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 23:58:13.362254 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362208 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 24 23:58:13.362322 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362295 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"} err="failed to get container status \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": rpc error: code = NotFound desc = could not find container \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": container with ID starting with b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545 not found: ID does not exist"
Apr 24 23:58:13.362380 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362325 2575 scope.go:117] "RemoveContainer" containerID="2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"
Apr 24 23:58:13.362437 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362386 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 23:58:13.362497 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362476 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 24 23:58:13.362551 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362539 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4aqfllkh3n0nv\""
Apr 24 23:58:13.362599 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362551 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"} err="failed to get container status \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": rpc error: code = NotFound desc = could not find container \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": container with ID starting with 2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f not found: ID does not exist"
Apr 24 23:58:13.362599 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362574 2575 scope.go:117] "RemoveContainer" containerID="0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"
Apr 24 23:58:13.362691 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362611 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 24 23:58:13.362727 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362688 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 24 23:58:13.362861 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362838 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"} err="failed to get container status \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": rpc error: code = NotFound desc = could not find container \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": container with ID starting with 0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee not found: ID does not exist"
Apr 24 23:58:13.362916 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362863 2575 scope.go:117] "RemoveContainer" containerID="9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"
Apr 24 23:58:13.362998 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.362978 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 23:58:13.363081 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363063 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 23:58:13.363130 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363089 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"} err="failed to get container status \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": rpc error: code = NotFound desc = could not find container \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": container with ID starting with 9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb not found: ID does not exist"
Apr 24 23:58:13.363130 ip-10-0-128-234 kubenswrapper[2575]: I0424
23:58:13.363105 2575 scope.go:117] "RemoveContainer" containerID="7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e" Apr 24 23:58:13.363323 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363304 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"} err="failed to get container status \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": rpc error: code = NotFound desc = could not find container \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": container with ID starting with 7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e not found: ID does not exist" Apr 24 23:58:13.363375 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363325 2575 scope.go:117] "RemoveContainer" containerID="0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166" Apr 24 23:58:13.363581 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363548 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166"} err="failed to get container status \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": rpc error: code = NotFound desc = could not find container \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": container with ID starting with 0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166 not found: ID does not exist" Apr 24 23:58:13.363581 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363580 2575 scope.go:117] "RemoveContainer" containerID="3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7" Apr 24 23:58:13.363718 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363687 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 23:58:13.363718 
ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 23:58:13.363854 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363688 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sjtmx\"" Apr 24 23:58:13.363854 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363810 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 23:58:13.363950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363860 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"} err="failed to get container status \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": rpc error: code = NotFound desc = could not find container \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": container with ID starting with 3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7 not found: ID does not exist" Apr 24 23:58:13.363950 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.363884 2575 scope.go:117] "RemoveContainer" containerID="b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545" Apr 24 23:58:13.364151 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.364125 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"} err="failed to get container status \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": rpc error: code = NotFound desc = could not find container \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": container with ID starting with 
b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545 not found: ID does not exist" Apr 24 23:58:13.364230 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.364156 2575 scope.go:117] "RemoveContainer" containerID="2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f" Apr 24 23:58:13.364431 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.364406 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"} err="failed to get container status \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": rpc error: code = NotFound desc = could not find container \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": container with ID starting with 2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f not found: ID does not exist" Apr 24 23:58:13.364493 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.364433 2575 scope.go:117] "RemoveContainer" containerID="0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee" Apr 24 23:58:13.364672 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.364651 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"} err="failed to get container status \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": rpc error: code = NotFound desc = could not find container \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": container with ID starting with 0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee not found: ID does not exist" Apr 24 23:58:13.364672 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.364672 2575 scope.go:117] "RemoveContainer" containerID="9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb" Apr 24 23:58:13.364965 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.364932 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"} err="failed to get container status \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": rpc error: code = NotFound desc = could not find container \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": container with ID starting with 9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb not found: ID does not exist" Apr 24 23:58:13.364965 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.364957 2575 scope.go:117] "RemoveContainer" containerID="7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e" Apr 24 23:58:13.365218 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.365196 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"} err="failed to get container status \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": rpc error: code = NotFound desc = could not find container \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": container with ID starting with 7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e not found: ID does not exist" Apr 24 23:58:13.365218 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.365218 2575 scope.go:117] "RemoveContainer" containerID="0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166" Apr 24 23:58:13.365612 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.365559 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166"} err="failed to get container status \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": rpc error: code = NotFound desc = could not find container 
\"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": container with ID starting with 0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166 not found: ID does not exist" Apr 24 23:58:13.365612 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.365586 2575 scope.go:117] "RemoveContainer" containerID="3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7" Apr 24 23:58:13.365750 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.365575 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 23:58:13.366019 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.365990 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"} err="failed to get container status \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": rpc error: code = NotFound desc = could not find container \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": container with ID starting with 3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7 not found: ID does not exist" Apr 24 23:58:13.366095 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.366021 2575 scope.go:117] "RemoveContainer" containerID="b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545" Apr 24 23:58:13.366297 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.366271 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"} err="failed to get container status \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": rpc error: code = NotFound desc = could not find container \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": container with ID starting with 
b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545 not found: ID does not exist" Apr 24 23:58:13.366368 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.366298 2575 scope.go:117] "RemoveContainer" containerID="2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f" Apr 24 23:58:13.366523 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.366506 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"} err="failed to get container status \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": rpc error: code = NotFound desc = could not find container \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": container with ID starting with 2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f not found: ID does not exist" Apr 24 23:58:13.366523 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.366524 2575 scope.go:117] "RemoveContainer" containerID="0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee" Apr 24 23:58:13.366798 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.366737 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"} err="failed to get container status \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": rpc error: code = NotFound desc = could not find container \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": container with ID starting with 0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee not found: ID does not exist" Apr 24 23:58:13.366798 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.366773 2575 scope.go:117] "RemoveContainer" containerID="9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb" Apr 24 23:58:13.367114 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.367090 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"} err="failed to get container status \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": rpc error: code = NotFound desc = could not find container \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": container with ID starting with 9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb not found: ID does not exist" Apr 24 23:58:13.367212 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.367116 2575 scope.go:117] "RemoveContainer" containerID="7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e" Apr 24 23:58:13.367426 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.367399 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"} err="failed to get container status \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": rpc error: code = NotFound desc = could not find container \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": container with ID starting with 7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e not found: ID does not exist" Apr 24 23:58:13.367493 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.367428 2575 scope.go:117] "RemoveContainer" containerID="0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166" Apr 24 23:58:13.367698 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.367680 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166"} err="failed to get container status \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": rpc error: code = NotFound desc = could not find container 
\"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": container with ID starting with 0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166 not found: ID does not exist" Apr 24 23:58:13.367769 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.367700 2575 scope.go:117] "RemoveContainer" containerID="3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7" Apr 24 23:58:13.367978 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.367957 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"} err="failed to get container status \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": rpc error: code = NotFound desc = could not find container \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": container with ID starting with 3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7 not found: ID does not exist" Apr 24 23:58:13.368034 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.367983 2575 scope.go:117] "RemoveContainer" containerID="b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545" Apr 24 23:58:13.368217 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.368199 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"} err="failed to get container status \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": rpc error: code = NotFound desc = could not find container \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": container with ID starting with b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545 not found: ID does not exist" Apr 24 23:58:13.368284 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.368220 2575 scope.go:117] "RemoveContainer" 
containerID="2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f" Apr 24 23:58:13.368476 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.368459 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"} err="failed to get container status \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": rpc error: code = NotFound desc = could not find container \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": container with ID starting with 2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f not found: ID does not exist" Apr 24 23:58:13.368514 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.368478 2575 scope.go:117] "RemoveContainer" containerID="0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee" Apr 24 23:58:13.368705 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.368686 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"} err="failed to get container status \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": rpc error: code = NotFound desc = could not find container \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": container with ID starting with 0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee not found: ID does not exist" Apr 24 23:58:13.368752 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.368706 2575 scope.go:117] "RemoveContainer" containerID="9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb" Apr 24 23:58:13.368966 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.368947 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"} err="failed to get container status 
\"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": rpc error: code = NotFound desc = could not find container \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": container with ID starting with 9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb not found: ID does not exist" Apr 24 23:58:13.369010 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.368968 2575 scope.go:117] "RemoveContainer" containerID="7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e" Apr 24 23:58:13.369184 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.369162 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"} err="failed to get container status \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": rpc error: code = NotFound desc = could not find container \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": container with ID starting with 7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e not found: ID does not exist" Apr 24 23:58:13.369184 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.369184 2575 scope.go:117] "RemoveContainer" containerID="0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166" Apr 24 23:58:13.369376 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.369359 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166"} err="failed to get container status \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": rpc error: code = NotFound desc = could not find container \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": container with ID starting with 0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166 not found: ID does not exist" Apr 24 23:58:13.369442 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:58:13.369378 2575 scope.go:117] "RemoveContainer" containerID="3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7" Apr 24 23:58:13.369605 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.369589 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7"} err="failed to get container status \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": rpc error: code = NotFound desc = could not find container \"3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7\": container with ID starting with 3a691500503c2bd4d88c49686ad5d9ab2116380bf673fda04ffaea5e92796bf7 not found: ID does not exist" Apr 24 23:58:13.369605 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.369605 2575 scope.go:117] "RemoveContainer" containerID="b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545" Apr 24 23:58:13.369744 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.369729 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 23:58:13.369882 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.369859 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545"} err="failed to get container status \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": rpc error: code = NotFound desc = could not find container \"b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545\": container with ID starting with b1d91c7183c51e88b8e5deeeeb13d3750307348f5717b25bdfcaf5e2ad741545 not found: ID does not exist" Apr 24 23:58:13.369882 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.369878 2575 scope.go:117] "RemoveContainer" 
containerID="2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f" Apr 24 23:58:13.370124 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.370089 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f"} err="failed to get container status \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": rpc error: code = NotFound desc = could not find container \"2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f\": container with ID starting with 2e0cfa6ae9e2afa6f03e95cba210e197c36ebf8c04fe99b1ede636c24e50367f not found: ID does not exist" Apr 24 23:58:13.370124 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.370109 2575 scope.go:117] "RemoveContainer" containerID="0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee" Apr 24 23:58:13.370349 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.370333 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee"} err="failed to get container status \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": rpc error: code = NotFound desc = could not find container \"0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee\": container with ID starting with 0354b1b39c07b3ea3ee39de17ec7d3433627a7bd5f2ff2954f2639f31d53a0ee not found: ID does not exist" Apr 24 23:58:13.370401 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.370351 2575 scope.go:117] "RemoveContainer" containerID="9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb" Apr 24 23:58:13.370573 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.370547 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb"} err="failed to get container status 
\"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": rpc error: code = NotFound desc = could not find container \"9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb\": container with ID starting with 9d510b736b338f5f72530296708e596508168b0dd3a2fcc6086b7bdc6f468dbb not found: ID does not exist" Apr 24 23:58:13.370733 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.370577 2575 scope.go:117] "RemoveContainer" containerID="7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e" Apr 24 23:58:13.370873 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.370850 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e"} err="failed to get container status \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": rpc error: code = NotFound desc = could not find container \"7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e\": container with ID starting with 7e433614d5d1403fc9755926ddefee8f08cb433b9e9c88c6f6cd2d94fa46c69e not found: ID does not exist" Apr 24 23:58:13.370939 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.370876 2575 scope.go:117] "RemoveContainer" containerID="0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166" Apr 24 23:58:13.371340 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.371314 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166"} err="failed to get container status \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": rpc error: code = NotFound desc = could not find container \"0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166\": container with ID starting with 0553cd5cb0ba3fef4cf5538ea5ca36ef2b90e58a01b4c9612d93c2cf37460166 not found: ID does not exist" Apr 24 23:58:13.372903 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:58:13.372884 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:58:13.443929 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.443907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444034 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.443934 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444034 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.443954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444034 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.443974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6622fd3d-3dad-4db6-987d-8c4d84fd7148-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444152 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444038 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444152 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444152 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444152 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444111 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv2wj\" (UniqueName: \"kubernetes.io/projected/6622fd3d-3dad-4db6-987d-8c4d84fd7148-kube-api-access-sv2wj\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444152 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444330 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444171 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444330 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444330 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444330 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444245 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444330 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444292 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444330 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6622fd3d-3dad-4db6-987d-8c4d84fd7148-config-out\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444501 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444368 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-config\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444501 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-web-config\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.444501 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.444399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6622fd3d-3dad-4db6-987d-8c4d84fd7148-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545512 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:58:13.545478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv2wj\" (UniqueName: \"kubernetes.io/projected/6622fd3d-3dad-4db6-987d-8c4d84fd7148-kube-api-access-sv2wj\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545512 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545664 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545664 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545664 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545664 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545886 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545886 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6622fd3d-3dad-4db6-987d-8c4d84fd7148-config-out\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545886 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-config\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545886 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.545886 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6622fd3d-3dad-4db6-987d-8c4d84fd7148-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.546125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.546125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.546125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.546125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.545992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/6622fd3d-3dad-4db6-987d-8c4d84fd7148-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.546125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.546049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.546125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.546072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.546125 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.546099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.546466 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.546313 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6622fd3d-3dad-4db6-987d-8c4d84fd7148-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.546466 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.546435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.546567 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.546460 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.547371 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.547071 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.548694 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.548631 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.548805 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.548760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6622fd3d-3dad-4db6-987d-8c4d84fd7148-config-out\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.548805 ip-10-0-128-234 kubenswrapper[2575]: I0424 
23:58:13.548771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.549116 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.549091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.549202 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.549188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.549259 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.549211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.549734 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.549693 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
23:58:13.549898 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.549854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.550002 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.549918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.550980 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.550956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6622fd3d-3dad-4db6-987d-8c4d84fd7148-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.551422 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.551401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-web-config\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.551629 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.551614 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6622fd3d-3dad-4db6-987d-8c4d84fd7148-config\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.552621 ip-10-0-128-234 
kubenswrapper[2575]: I0424 23:58:13.552581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6622fd3d-3dad-4db6-987d-8c4d84fd7148-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.554040 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.554021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv2wj\" (UniqueName: \"kubernetes.io/projected/6622fd3d-3dad-4db6-987d-8c4d84fd7148-kube-api-access-sv2wj\") pod \"prometheus-k8s-0\" (UID: \"6622fd3d-3dad-4db6-987d-8c4d84fd7148\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.670051 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.670023 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:13.789880 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:13.789788 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:58:13.791977 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:58:13.791949 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6622fd3d_3dad_4db6_987d_8c4d84fd7148.slice/crio-f49b2a84da129eda8147de067b5bef145b9a0443c3fda983e8720b70899b59c6 WatchSource:0}: Error finding container f49b2a84da129eda8147de067b5bef145b9a0443c3fda983e8720b70899b59c6: Status 404 returned error can't find the container with id f49b2a84da129eda8147de067b5bef145b9a0443c3fda983e8720b70899b59c6 Apr 24 23:58:14.311474 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:14.311441 2575 generic.go:358] "Generic (PLEG): container finished" podID="6622fd3d-3dad-4db6-987d-8c4d84fd7148" containerID="1f2d53e742cf71df1e7c364cfce3d8c232354af881c1369c6efb9637611524e8" exitCode=0 Apr 24 
23:58:14.311837 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:14.311521 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6622fd3d-3dad-4db6-987d-8c4d84fd7148","Type":"ContainerDied","Data":"1f2d53e742cf71df1e7c364cfce3d8c232354af881c1369c6efb9637611524e8"} Apr 24 23:58:14.311837 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:14.311550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6622fd3d-3dad-4db6-987d-8c4d84fd7148","Type":"ContainerStarted","Data":"f49b2a84da129eda8147de067b5bef145b9a0443c3fda983e8720b70899b59c6"} Apr 24 23:58:14.414905 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:14.414877 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56404ccf-ab1e-4c6f-bdb4-44a1718d54dc" path="/var/lib/kubelet/pods/56404ccf-ab1e-4c6f-bdb4-44a1718d54dc/volumes" Apr 24 23:58:14.887326 ip-10-0-128-234 kubenswrapper[2575]: E0424 23:58:14.887257 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6kzrg" podUID="f5e433d1-e134-4312-9418-0c609e10c09c" Apr 24 23:58:15.317983 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:15.317947 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6622fd3d-3dad-4db6-987d-8c4d84fd7148","Type":"ContainerStarted","Data":"cb3afd076316cd2a28b5e651314ccd4899f2e4e9d00ac0320096e6123d0e3334"} Apr 24 23:58:15.317983 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:15.317973 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6kzrg" Apr 24 23:58:15.317983 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:15.317984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6622fd3d-3dad-4db6-987d-8c4d84fd7148","Type":"ContainerStarted","Data":"d149be2489be545a2c98f96715d46134b8318ba9882d6b0b426b969404424e8d"} Apr 24 23:58:15.318436 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:15.317994 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6622fd3d-3dad-4db6-987d-8c4d84fd7148","Type":"ContainerStarted","Data":"a82c38115d915dea3077ff970f374fe38611096969bf56aa8c7e929cc119044f"} Apr 24 23:58:15.318436 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:15.318003 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6622fd3d-3dad-4db6-987d-8c4d84fd7148","Type":"ContainerStarted","Data":"00e80309404c00e3a4073a7e9df1c4ec39225f29d2c71cefc4f8f0c0e0b01d26"} Apr 24 23:58:15.318436 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:15.318013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6622fd3d-3dad-4db6-987d-8c4d84fd7148","Type":"ContainerStarted","Data":"460cdf1d9bace22351c04a6d4e28f52caf8b347dbeb24fb04d2665d0469acc5e"} Apr 24 23:58:15.318436 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:15.318021 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6622fd3d-3dad-4db6-987d-8c4d84fd7148","Type":"ContainerStarted","Data":"d9d02caad0bef0de79268c5e8e6a396a849465494fc1a9011eef0a1747638162"} Apr 24 23:58:15.348814 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:15.348751 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.348737416 podStartE2EDuration="2.348737416s" 
podCreationTimestamp="2026-04-24 23:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:58:15.346895182 +0000 UTC m=+279.500508733" watchObservedRunningTime="2026-04-24 23:58:15.348737416 +0000 UTC m=+279.502350967" Apr 24 23:58:18.670973 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:18.670936 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:18.691856 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:18.691814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg" Apr 24 23:58:18.691961 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:18.691907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " pod="openshift-ingress-canary/ingress-canary-c5wq4" Apr 24 23:58:18.694156 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:18.694134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5e433d1-e134-4312-9418-0c609e10c09c-metrics-tls\") pod \"dns-default-6kzrg\" (UID: \"f5e433d1-e134-4312-9418-0c609e10c09c\") " pod="openshift-dns/dns-default-6kzrg" Apr 24 23:58:18.694222 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:18.694193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efeb0526-c248-4f97-ad71-12762132cd18-cert\") pod \"ingress-canary-c5wq4\" (UID: \"efeb0526-c248-4f97-ad71-12762132cd18\") " 
pod="openshift-ingress-canary/ingress-canary-c5wq4" Apr 24 23:58:18.913466 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:18.913437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gqd2q\"" Apr 24 23:58:18.920869 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:18.920847 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c5wq4" Apr 24 23:58:18.920955 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:18.920939 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-drk7t\"" Apr 24 23:58:18.928919 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:18.928898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6kzrg" Apr 24 23:58:19.074191 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:19.074158 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c5wq4"] Apr 24 23:58:19.080531 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:58:19.080497 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefeb0526_c248_4f97_ad71_12762132cd18.slice/crio-6b67e101c7567c96d65d835e1f9c6e63871050fc838172078dc7bb2187cc0d2c WatchSource:0}: Error finding container 6b67e101c7567c96d65d835e1f9c6e63871050fc838172078dc7bb2187cc0d2c: Status 404 returned error can't find the container with id 6b67e101c7567c96d65d835e1f9c6e63871050fc838172078dc7bb2187cc0d2c Apr 24 23:58:19.097351 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:19.097328 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6kzrg"] Apr 24 23:58:19.099718 ip-10-0-128-234 kubenswrapper[2575]: W0424 23:58:19.099694 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5e433d1_e134_4312_9418_0c609e10c09c.slice/crio-7de66fdab8f88846c898455dcde33f212bcbf322cffde3e7aafff1d7801a9c7b WatchSource:0}: Error finding container 7de66fdab8f88846c898455dcde33f212bcbf322cffde3e7aafff1d7801a9c7b: Status 404 returned error can't find the container with id 7de66fdab8f88846c898455dcde33f212bcbf322cffde3e7aafff1d7801a9c7b Apr 24 23:58:19.332402 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:19.332318 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kzrg" event={"ID":"f5e433d1-e134-4312-9418-0c609e10c09c","Type":"ContainerStarted","Data":"7de66fdab8f88846c898455dcde33f212bcbf322cffde3e7aafff1d7801a9c7b"} Apr 24 23:58:19.333209 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:19.333185 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c5wq4" event={"ID":"efeb0526-c248-4f97-ad71-12762132cd18","Type":"ContainerStarted","Data":"6b67e101c7567c96d65d835e1f9c6e63871050fc838172078dc7bb2187cc0d2c"} Apr 24 23:58:21.341147 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:21.341105 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kzrg" event={"ID":"f5e433d1-e134-4312-9418-0c609e10c09c","Type":"ContainerStarted","Data":"532770d0be23fd8a83951559f3575e93ac3d4721e224ac08b48b071ccf4b20fd"} Apr 24 23:58:21.341147 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:21.341150 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kzrg" event={"ID":"f5e433d1-e134-4312-9418-0c609e10c09c","Type":"ContainerStarted","Data":"6f8321d69e5da4d4048b482ded41bf4bf082be4aa4441edd288d30d542c7f86f"} Apr 24 23:58:21.341592 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:21.341327 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6kzrg" Apr 24 23:58:21.342421 ip-10-0-128-234 kubenswrapper[2575]: I0424 
23:58:21.342401 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c5wq4" event={"ID":"efeb0526-c248-4f97-ad71-12762132cd18","Type":"ContainerStarted","Data":"10b07f1487168c5b762369d9c7f1f4b2ba0e1e1bb8e4a13fc83c1b3bfec0d0cb"} Apr 24 23:58:21.361475 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:21.361389 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6kzrg" podStartSLOduration=251.621801121 podStartE2EDuration="4m13.361373356s" podCreationTimestamp="2026-04-24 23:54:08 +0000 UTC" firstStartedPulling="2026-04-24 23:58:19.101404414 +0000 UTC m=+283.255017943" lastFinishedPulling="2026-04-24 23:58:20.840976646 +0000 UTC m=+284.994590178" observedRunningTime="2026-04-24 23:58:21.360033955 +0000 UTC m=+285.513647539" watchObservedRunningTime="2026-04-24 23:58:21.361373356 +0000 UTC m=+285.514986908" Apr 24 23:58:21.376097 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:21.376044 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-c5wq4" podStartSLOduration=251.613927564 podStartE2EDuration="4m13.376025296s" podCreationTimestamp="2026-04-24 23:54:08 +0000 UTC" firstStartedPulling="2026-04-24 23:58:19.082747819 +0000 UTC m=+283.236361349" lastFinishedPulling="2026-04-24 23:58:20.844845549 +0000 UTC m=+284.998459081" observedRunningTime="2026-04-24 23:58:21.373770909 +0000 UTC m=+285.527384462" watchObservedRunningTime="2026-04-24 23:58:21.376025296 +0000 UTC m=+285.529638847" Apr 24 23:58:31.348635 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:31.348599 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6kzrg" Apr 24 23:58:36.306081 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:36.306050 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 24 23:58:36.306558 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:36.306225 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 24 23:58:36.320630 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:58:36.320611 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 23:59:13.670280 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:59:13.670242 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:59:13.685880 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:59:13.685852 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:59:14.509352 ip-10-0-128-234 kubenswrapper[2575]: I0424 23:59:14.509325 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 25 00:03:36.335213 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:03:36.335183 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:03:36.336222 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:03:36.336198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:08:36.358262 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:08:36.358230 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:08:36.359849 ip-10-0-128-234 
kubenswrapper[2575]: I0425 00:08:36.359828 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:13:36.380217 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:13:36.380176 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:13:36.382274 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:13:36.382254 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:18:36.404109 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:18:36.404072 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:18:36.405686 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:18:36.405662 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:23:36.425635 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:23:36.425603 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:23:36.429245 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:23:36.429223 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:28:36.446601 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:28:36.446574 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:28:36.450840 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:28:36.450766 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:33:36.466944 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:33:36.466912 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:33:36.475257 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:33:36.475234 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:38:36.487890 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:38:36.487813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:38:36.499100 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:38:36.499078 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:43:36.512481 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:43:36.512453 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:43:36.525104 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:43:36.525080 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:48:36.538347 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:48:36.538315 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:48:36.545504 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:48:36.545482 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:53:36.559772 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:53:36.559736 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:53:36.567402 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:53:36.567379 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:58:36.580655 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:58:36.580623 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:58:36.588720 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:58:36.588699 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:59:03.513947 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.513914 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kwqcn/must-gather-5gsjb"] Apr 25 00:59:03.517240 ip-10-0-128-234 kubenswrapper[2575]: I0425 
00:59:03.517225 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" Apr 25 00:59:03.519635 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.519611 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kwqcn\"/\"openshift-service-ca.crt\"" Apr 25 00:59:03.519740 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.519634 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kwqcn\"/\"kube-root-ca.crt\"" Apr 25 00:59:03.523502 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.523482 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kwqcn/must-gather-5gsjb"] Apr 25 00:59:03.526281 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.526251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-must-gather-output\") pod \"must-gather-5gsjb\" (UID: \"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e\") " pod="openshift-must-gather-kwqcn/must-gather-5gsjb" Apr 25 00:59:03.526363 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.526313 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrdk\" (UniqueName: \"kubernetes.io/projected/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-kube-api-access-jkrdk\") pod \"must-gather-5gsjb\" (UID: \"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e\") " pod="openshift-must-gather-kwqcn/must-gather-5gsjb" Apr 25 00:59:03.627300 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.627262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrdk\" (UniqueName: \"kubernetes.io/projected/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-kube-api-access-jkrdk\") pod \"must-gather-5gsjb\" (UID: 
\"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e\") " pod="openshift-must-gather-kwqcn/must-gather-5gsjb" Apr 25 00:59:03.627455 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.627317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-must-gather-output\") pod \"must-gather-5gsjb\" (UID: \"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e\") " pod="openshift-must-gather-kwqcn/must-gather-5gsjb" Apr 25 00:59:03.627633 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.627618 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-must-gather-output\") pod \"must-gather-5gsjb\" (UID: \"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e\") " pod="openshift-must-gather-kwqcn/must-gather-5gsjb" Apr 25 00:59:03.634748 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.634730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrdk\" (UniqueName: \"kubernetes.io/projected/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-kube-api-access-jkrdk\") pod \"must-gather-5gsjb\" (UID: \"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e\") " pod="openshift-must-gather-kwqcn/must-gather-5gsjb" Apr 25 00:59:03.838169 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.838073 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" Apr 25 00:59:03.954260 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.954169 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kwqcn/must-gather-5gsjb"] Apr 25 00:59:03.956746 ip-10-0-128-234 kubenswrapper[2575]: W0425 00:59:03.956716 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb9f10d_570c_47c6_9ac3_bcc46f56d18e.slice/crio-8e9bab0966b18e0676f33884aac9dc0a756990fc8a2e053e26cf3e736a68ac78 WatchSource:0}: Error finding container 8e9bab0966b18e0676f33884aac9dc0a756990fc8a2e053e26cf3e736a68ac78: Status 404 returned error can't find the container with id 8e9bab0966b18e0676f33884aac9dc0a756990fc8a2e053e26cf3e736a68ac78 Apr 25 00:59:03.958359 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:03.958344 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:59:04.790177 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:04.790135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" event={"ID":"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e","Type":"ContainerStarted","Data":"8e9bab0966b18e0676f33884aac9dc0a756990fc8a2e053e26cf3e736a68ac78"} Apr 25 00:59:09.808540 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:09.808499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" event={"ID":"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e","Type":"ContainerStarted","Data":"7a778b94f62ee80732690f967be0daac53d935f132a248e2f3262dc40619718a"} Apr 25 00:59:09.808966 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:09.808545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" 
event={"ID":"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e","Type":"ContainerStarted","Data":"10e56ce37a1c486656ed8d0439a041572381adf1bc32f3784bad495882dace22"} Apr 25 00:59:09.823420 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:09.823374 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" podStartSLOduration=2.059284257 podStartE2EDuration="6.823359251s" podCreationTimestamp="2026-04-25 00:59:03 +0000 UTC" firstStartedPulling="2026-04-25 00:59:03.958483171 +0000 UTC m=+3928.112096700" lastFinishedPulling="2026-04-25 00:59:08.722558149 +0000 UTC m=+3932.876171694" observedRunningTime="2026-04-25 00:59:09.822728164 +0000 UTC m=+3933.976341726" watchObservedRunningTime="2026-04-25 00:59:09.823359251 +0000 UTC m=+3933.976972805" Apr 25 00:59:29.869557 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:29.869520 2575 generic.go:358] "Generic (PLEG): container finished" podID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" containerID="10e56ce37a1c486656ed8d0439a041572381adf1bc32f3784bad495882dace22" exitCode=0 Apr 25 00:59:29.869557 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:29.869566 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" event={"ID":"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e","Type":"ContainerDied","Data":"10e56ce37a1c486656ed8d0439a041572381adf1bc32f3784bad495882dace22"} Apr 25 00:59:29.870014 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:29.869903 2575 scope.go:117] "RemoveContainer" containerID="10e56ce37a1c486656ed8d0439a041572381adf1bc32f3784bad495882dace22" Apr 25 00:59:30.285043 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:30.285013 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kwqcn_must-gather-5gsjb_4fb9f10d-570c-47c6-9ac3-bcc46f56d18e/gather/0.log" Apr 25 00:59:33.732619 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:33.732584 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-csxnd_50a991ab-3e74-4c00-bb7b-b6b5ca42b15f/global-pull-secret-syncer/0.log" Apr 25 00:59:33.921486 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:33.921460 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zgjcf_a3a145f6-4ef4-43fd-985d-2692fdc60a0b/konnectivity-agent/0.log" Apr 25 00:59:33.946066 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:33.946045 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-234.ec2.internal_aef0222d478965f43e6fdd10ed145026/haproxy/0.log" Apr 25 00:59:35.761632 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:35.761603 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kwqcn/must-gather-5gsjb"] Apr 25 00:59:35.762120 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:35.761812 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" containerName="copy" containerID="cri-o://7a778b94f62ee80732690f967be0daac53d935f132a248e2f3262dc40619718a" gracePeriod=2 Apr 25 00:59:35.763892 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:35.763861 2575 status_manager.go:895] "Failed to get status for pod" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" err="pods \"must-gather-5gsjb\" is forbidden: User \"system:node:ip-10-0-128-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kwqcn\": no relationship found between node 'ip-10-0-128-234.ec2.internal' and this object" Apr 25 00:59:35.765226 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:35.765206 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kwqcn/must-gather-5gsjb"] Apr 25 00:59:35.887818 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:35.887770 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kwqcn_must-gather-5gsjb_4fb9f10d-570c-47c6-9ac3-bcc46f56d18e/copy/0.log" Apr 25 00:59:35.888199 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:35.888170 2575 generic.go:358] "Generic (PLEG): container finished" podID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" containerID="7a778b94f62ee80732690f967be0daac53d935f132a248e2f3262dc40619718a" exitCode=143 Apr 25 00:59:35.987760 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:35.987741 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kwqcn_must-gather-5gsjb_4fb9f10d-570c-47c6-9ac3-bcc46f56d18e/copy/0.log" Apr 25 00:59:35.988157 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:35.988141 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" Apr 25 00:59:35.990245 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:35.990219 2575 status_manager.go:895] "Failed to get status for pod" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" err="pods \"must-gather-5gsjb\" is forbidden: User \"system:node:ip-10-0-128-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kwqcn\": no relationship found between node 'ip-10-0-128-234.ec2.internal' and this object" Apr 25 00:59:36.103840 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.103767 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkrdk\" (UniqueName: \"kubernetes.io/projected/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-kube-api-access-jkrdk\") pod \"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e\" (UID: \"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e\") " Apr 25 00:59:36.103840 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.103818 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-must-gather-output\") pod \"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e\" (UID: \"4fb9f10d-570c-47c6-9ac3-bcc46f56d18e\") " Apr 25 00:59:36.104956 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.104929 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" (UID: "4fb9f10d-570c-47c6-9ac3-bcc46f56d18e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:59:36.105873 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.105847 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-kube-api-access-jkrdk" (OuterVolumeSpecName: "kube-api-access-jkrdk") pod "4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" (UID: "4fb9f10d-570c-47c6-9ac3-bcc46f56d18e"). InnerVolumeSpecName "kube-api-access-jkrdk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:59:36.204705 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.204682 2575 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-must-gather-output\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 25 00:59:36.204705 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.204703 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jkrdk\" (UniqueName: \"kubernetes.io/projected/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e-kube-api-access-jkrdk\") on node \"ip-10-0-128-234.ec2.internal\" DevicePath \"\"" Apr 25 00:59:36.414047 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.414013 2575 status_manager.go:895] "Failed to get status for pod" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" err="pods \"must-gather-5gsjb\" is forbidden: User \"system:node:ip-10-0-128-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kwqcn\": no relationship found between node 'ip-10-0-128-234.ec2.internal' and this object" Apr 25 00:59:36.414638 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.414616 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" path="/var/lib/kubelet/pods/4fb9f10d-570c-47c6-9ac3-bcc46f56d18e/volumes" Apr 25 00:59:36.892829 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.892739 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kwqcn/must-gather-5gsjb" Apr 25 00:59:36.892829 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.892748 2575 scope.go:117] "RemoveContainer" containerID="7a778b94f62ee80732690f967be0daac53d935f132a248e2f3262dc40619718a" Apr 25 00:59:36.900196 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:36.900174 2575 scope.go:117] "RemoveContainer" containerID="10e56ce37a1c486656ed8d0439a041572381adf1bc32f3784bad495882dace22" Apr 25 00:59:37.492185 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:37.492162 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b90943c7-0bb9-42ba-a169-6dc56578b450/alertmanager/0.log" Apr 25 00:59:37.516373 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:37.516351 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b90943c7-0bb9-42ba-a169-6dc56578b450/config-reloader/0.log" Apr 25 00:59:37.541418 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:37.541375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b90943c7-0bb9-42ba-a169-6dc56578b450/kube-rbac-proxy-web/0.log" Apr 25 00:59:37.563880 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:37.563859 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b90943c7-0bb9-42ba-a169-6dc56578b450/kube-rbac-proxy/0.log" Apr 25 00:59:37.586788 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:37.586764 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b90943c7-0bb9-42ba-a169-6dc56578b450/kube-rbac-proxy-metric/0.log" Apr 25 00:59:37.615621 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:37.615602 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b90943c7-0bb9-42ba-a169-6dc56578b450/prom-label-proxy/0.log" Apr 25 00:59:37.638583 
ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:37.638565 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b90943c7-0bb9-42ba-a169-6dc56578b450/init-config-reloader/0.log" Apr 25 00:59:37.681966 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:37.681944 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-fqcfk_ead2f3e3-ce6d-4077-a58c-391d75b4bb6c/cluster-monitoring-operator/0.log" Apr 25 00:59:37.850649 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:37.850583 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-9jzcf_54ddef74-ad26-4a93-a6a9-689019b22fd7/monitoring-plugin/0.log" Apr 25 00:59:38.107275 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.107205 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wjnsp_b1bb9157-0e73-4254-87e9-1965229f6880/node-exporter/0.log" Apr 25 00:59:38.127016 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.126990 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wjnsp_b1bb9157-0e73-4254-87e9-1965229f6880/kube-rbac-proxy/0.log" Apr 25 00:59:38.150608 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.150590 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wjnsp_b1bb9157-0e73-4254-87e9-1965229f6880/init-textfile/0.log" Apr 25 00:59:38.176060 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.176037 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-sdw6h_09dde8d4-e623-4ee0-84ec-541cf470f1d8/kube-rbac-proxy-main/0.log" Apr 25 00:59:38.196382 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.196361 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-sdw6h_09dde8d4-e623-4ee0-84ec-541cf470f1d8/kube-rbac-proxy-self/0.log" Apr 25 00:59:38.216116 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.216100 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-sdw6h_09dde8d4-e623-4ee0-84ec-541cf470f1d8/openshift-state-metrics/0.log" Apr 25 00:59:38.258042 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.258023 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6622fd3d-3dad-4db6-987d-8c4d84fd7148/prometheus/0.log" Apr 25 00:59:38.273356 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.273339 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6622fd3d-3dad-4db6-987d-8c4d84fd7148/config-reloader/0.log" Apr 25 00:59:38.294291 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.294273 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6622fd3d-3dad-4db6-987d-8c4d84fd7148/thanos-sidecar/0.log" Apr 25 00:59:38.314910 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.314893 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6622fd3d-3dad-4db6-987d-8c4d84fd7148/kube-rbac-proxy-web/0.log" Apr 25 00:59:38.338483 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.338468 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6622fd3d-3dad-4db6-987d-8c4d84fd7148/kube-rbac-proxy/0.log" Apr 25 00:59:38.366135 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.366084 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6622fd3d-3dad-4db6-987d-8c4d84fd7148/kube-rbac-proxy-thanos/0.log" Apr 25 00:59:38.393117 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:38.393102 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6622fd3d-3dad-4db6-987d-8c4d84fd7148/init-config-reloader/0.log" Apr 25 00:59:40.115221 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.115193 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/1.log" Apr 25 00:59:40.121015 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.120995 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wlwd7_c4bdc38e-3787-4844-9d7a-323427247405/console-operator/2.log" Apr 25 00:59:40.559932 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.559903 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"] Apr 25 00:59:40.560208 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.560195 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" containerName="gather" Apr 25 00:59:40.560268 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.560210 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" containerName="gather" Apr 25 00:59:40.560268 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.560222 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" containerName="copy" Apr 25 00:59:40.560268 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.560227 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" containerName="copy" Apr 25 00:59:40.560356 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.560281 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" containerName="gather" Apr 25 00:59:40.560356 ip-10-0-128-234 
kubenswrapper[2575]: I0425 00:59:40.560293 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fb9f10d-570c-47c6-9ac3-bcc46f56d18e" containerName="copy" Apr 25 00:59:40.565189 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.565169 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m" Apr 25 00:59:40.567833 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.567812 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-htmzv\"/\"openshift-service-ca.crt\"" Apr 25 00:59:40.567921 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.567866 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-htmzv\"/\"default-dockercfg-tmtwr\"" Apr 25 00:59:40.568898 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.568876 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-htmzv\"/\"kube-root-ca.crt\"" Apr 25 00:59:40.574881 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.574862 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"] Apr 25 00:59:40.636478 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.636452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48bdt\" (UniqueName: \"kubernetes.io/projected/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-kube-api-access-48bdt\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m" Apr 25 00:59:40.636615 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.636486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-podres\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m" Apr 25 00:59:40.636615 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.636583 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-sys\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m" Apr 25 00:59:40.636711 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.636616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-proc\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m" Apr 25 00:59:40.636711 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.636639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-lib-modules\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m" Apr 25 00:59:40.736987 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.736957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-sys\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m" Apr 25 00:59:40.737090 ip-10-0-128-234 
kubenswrapper[2575]: I0425 00:59:40.736991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-proc\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:40.737090 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.737016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-lib-modules\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:40.737090 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.737072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-sys\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:40.737090 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.737083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48bdt\" (UniqueName: \"kubernetes.io/projected/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-kube-api-access-48bdt\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:40.737298 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.737076 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-proc\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:40.737298 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.737134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-podres\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:40.737298 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.737160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-lib-modules\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:40.737298 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.737255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-podres\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:40.744666 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.744645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48bdt\" (UniqueName: \"kubernetes.io/projected/b5ea612f-4a6e-4232-a2f9-b6ff78db86f5-kube-api-access-48bdt\") pod \"perf-node-gather-daemonset-fdg2m\" (UID: \"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:40.875111 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.875056 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:40.995241 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:40.995211 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"]
Apr 25 00:59:40.998253 ip-10-0-128-234 kubenswrapper[2575]: W0425 00:59:40.998228 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb5ea612f_4a6e_4232_a2f9_b6ff78db86f5.slice/crio-4050e64eef6533ebd1702240051b94fdc07b8b41a166bfe5fac1b48aed81e7a2 WatchSource:0}: Error finding container 4050e64eef6533ebd1702240051b94fdc07b8b41a166bfe5fac1b48aed81e7a2: Status 404 returned error can't find the container with id 4050e64eef6533ebd1702240051b94fdc07b8b41a166bfe5fac1b48aed81e7a2
Apr 25 00:59:41.488642 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:41.488615 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6kzrg_f5e433d1-e134-4312-9418-0c609e10c09c/dns/0.log"
Apr 25 00:59:41.507870 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:41.507842 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6kzrg_f5e433d1-e134-4312-9418-0c609e10c09c/kube-rbac-proxy/0.log"
Apr 25 00:59:41.631918 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:41.631895 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rpxzt_60d87af7-f253-4e76-9605-d6707237c596/dns-node-resolver/0.log"
Apr 25 00:59:41.909299 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:41.909265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m" event={"ID":"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5","Type":"ContainerStarted","Data":"8b879f98c7ce4d10d598af888f35032153b52e8f27b1c58ed3724e1e196a205b"}
Apr 25 00:59:41.909438 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:41.909305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m" event={"ID":"b5ea612f-4a6e-4232-a2f9-b6ff78db86f5","Type":"ContainerStarted","Data":"4050e64eef6533ebd1702240051b94fdc07b8b41a166bfe5fac1b48aed81e7a2"}
Apr 25 00:59:41.909438 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:41.909345 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:41.925208 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:41.925170 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m" podStartSLOduration=1.9251545970000001 podStartE2EDuration="1.925154597s" podCreationTimestamp="2026-04-25 00:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:59:41.924110401 +0000 UTC m=+3966.077723964" watchObservedRunningTime="2026-04-25 00:59:41.925154597 +0000 UTC m=+3966.078768148"
Apr 25 00:59:42.132721 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:42.132695 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-588b569ff5-gxtpm_f0f5a613-6fd5-430e-ad23-4b5e9282d16b/registry/0.log"
Apr 25 00:59:42.172648 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:42.172596 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7662p_55b4791c-ab54-4f79-a22b-f9adb92a1461/node-ca/0.log"
Apr 25 00:59:43.171987 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:43.171952 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-c5wq4_efeb0526-c248-4f97-ad71-12762132cd18/serve-healthcheck-canary/0.log"
Apr 25 00:59:43.540010 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:43.539942 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9l2gf_dd2d64ec-40bf-420a-8983-c9eb0c1bb070/kube-rbac-proxy/0.log"
Apr 25 00:59:43.565879 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:43.565854 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9l2gf_dd2d64ec-40bf-420a-8983-c9eb0c1bb070/exporter/0.log"
Apr 25 00:59:43.587432 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:43.587412 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9l2gf_dd2d64ec-40bf-420a-8983-c9eb0c1bb070/extractor/0.log"
Apr 25 00:59:47.921878 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:47.921849 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-fdg2m"
Apr 25 00:59:49.865529 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:49.865502 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-c58ww_19ae70e3-42b5-45c3-9397-8982cf99e3ac/migrator/0.log"
Apr 25 00:59:49.885633 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:49.885602 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-c58ww_19ae70e3-42b5-45c3-9397-8982cf99e3ac/graceful-termination/0.log"
Apr 25 00:59:50.253735 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:50.253704 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-tgx7m_f79704e7-d789-4f4c-8f4b-b4183bea75dd/kube-storage-version-migrator-operator/1.log"
Apr 25 00:59:50.254535 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:50.254519 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-tgx7m_f79704e7-d789-4f4c-8f4b-b4183bea75dd/kube-storage-version-migrator-operator/0.log"
Apr 25 00:59:51.220169 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:51.220130 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bmv4v_53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb/kube-multus-additional-cni-plugins/0.log"
Apr 25 00:59:51.242434 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:51.242407 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bmv4v_53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb/egress-router-binary-copy/0.log"
Apr 25 00:59:51.262408 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:51.262384 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bmv4v_53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb/cni-plugins/0.log"
Apr 25 00:59:51.282374 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:51.282342 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bmv4v_53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb/bond-cni-plugin/0.log"
Apr 25 00:59:51.301114 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:51.301093 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bmv4v_53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb/routeoverride-cni/0.log"
Apr 25 00:59:51.319158 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:51.319135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bmv4v_53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb/whereabouts-cni-bincopy/0.log"
Apr 25 00:59:51.338539 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:51.338519 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bmv4v_53cf6d5a-8951-44fe-a1f1-b382fb7ffbdb/whereabouts-cni/0.log"
Apr 25 00:59:51.594830 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:51.594740 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pkrwp_bc9da0e8-bb12-42fb-a6da-363511285477/kube-multus/0.log"
Apr 25 00:59:51.697161 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:51.697139 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xvbxz_148a2391-987d-4318-b295-01018903ff94/network-metrics-daemon/0.log"
Apr 25 00:59:51.716082 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:51.716063 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xvbxz_148a2391-987d-4318-b295-01018903ff94/kube-rbac-proxy/0.log"
Apr 25 00:59:52.700361 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:52.700332 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4822f_f11aa6c0-4d7d-4326-84df-857c34aa6e63/ovn-controller/0.log"
Apr 25 00:59:52.732456 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:52.732433 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4822f_f11aa6c0-4d7d-4326-84df-857c34aa6e63/ovn-acl-logging/0.log"
Apr 25 00:59:52.751142 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:52.751125 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4822f_f11aa6c0-4d7d-4326-84df-857c34aa6e63/kube-rbac-proxy-node/0.log"
Apr 25 00:59:52.771268 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:52.771228 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4822f_f11aa6c0-4d7d-4326-84df-857c34aa6e63/kube-rbac-proxy-ovn-metrics/0.log"
Apr 25 00:59:52.791657 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:52.791638 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4822f_f11aa6c0-4d7d-4326-84df-857c34aa6e63/northd/0.log"
Apr 25 00:59:52.810095 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:52.810076 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4822f_f11aa6c0-4d7d-4326-84df-857c34aa6e63/nbdb/0.log"
Apr 25 00:59:52.828391 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:52.828369 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4822f_f11aa6c0-4d7d-4326-84df-857c34aa6e63/sbdb/0.log"
Apr 25 00:59:52.930692 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:52.930665 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4822f_f11aa6c0-4d7d-4326-84df-857c34aa6e63/ovnkube-controller/0.log"
Apr 25 00:59:54.231669 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:54.231643 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-v4tzb_9f47c623-583e-4112-8494-6b034149be3a/check-endpoints/0.log"
Apr 25 00:59:54.251870 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:54.251851 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-clwv5_eb26be0d-42dc-4350-8240-8da8402a51a3/network-check-target-container/0.log"
Apr 25 00:59:55.115799 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:55.115743 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2w7lz_908b6dc1-8fd0-4631-8022-d81ae6d15f95/iptables-alerter/0.log"
Apr 25 00:59:55.772334 ip-10-0-128-234 kubenswrapper[2575]: I0425 00:59:55.772310 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lxw9p_173885a1-11c8-47c2-a5b4-51ef670b7bc6/tuned/0.log"