Apr 23 08:14:22.084572 ip-10-0-129-53 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:14:22.525066 ip-10-0-129-53 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:14:22.525066 ip-10-0-129-53 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:14:22.525066 ip-10-0-129-53 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:14:22.525066 ip-10-0-129-53 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:14:22.525066 ip-10-0-129-53 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
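The deprecation warnings above all point at the file passed via --config. As a hedged sketch (not this node's actual settings), the flagged options map onto KubeletConfiguration v1beta1 fields roughly like this; the socket path and reservation values below are illustrative placeholders, and since a YAML parser accepts JSON, the emitted fragment is in the shape a kubelet config file could take:

```python
import json

# Sketch of config-file equivalents for the deprecated flags warned about
# above. Field names follow the KubeletConfiguration v1beta1 schema; all
# values are placeholders, not the real settings of ip-10-0-129-53.
kubelet_config = {
    "apiVersion": "kubelet.config.k8s.io/v1beta1",
    "kind": "KubeletConfiguration",
    # replaces --container-runtime-endpoint
    "containerRuntimeEndpoint": "unix:///var/run/crio/crio.sock",
    # replaces --volume-plugin-dir
    "volumePluginDir": "/etc/kubernetes/kubelet-plugins/volume/exec",
    # replaces --system-reserved
    "systemReserved": {"cpu": "500m", "memory": "1Gi"},
    # --minimum-container-ttl-duration is superseded by eviction settings
    "evictionHard": {"memory.available": "100Mi"},
}

# JSON is valid YAML, so this fragment is in the shape of a --config file.
print(json.dumps(kubelet_config, indent=2))
```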
Apr 23 08:14:22.527377 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.527272 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:14:22.529587 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529572 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:14:22.529587 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529587 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529590 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529593 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529596 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529599 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529602 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529605 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529607 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529611 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529614 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529617 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529621 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529623 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529628 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529632 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529635 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529638 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529641 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529644 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529646 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529649 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529652 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529655 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529658 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529660 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529664 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529666 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529669 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529672 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529675 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529677 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529680 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529682 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529685 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529687 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529690 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529693 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529695 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529698 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:14:22.530104 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529700 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529703 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529706 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529709 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529712 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529714 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529716 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529719 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529722 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529724 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529727 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529730 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529732 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529734 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529738 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529741 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529743 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529746 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529748 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529751 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:14:22.530626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529754 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529757 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529759 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529762 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529764 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529767 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529769 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529772 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529776 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529780 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529783 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529785 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529788 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529791 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529793 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529796 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529799 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529801 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529804 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:14:22.531109 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529806 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529809 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529811 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529814 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529817 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529819 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529822 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530190 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530195 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530198 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530201 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530204 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530206 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530209 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530212 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530214 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530217 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530219 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530222 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530225 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530228 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530230 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530233 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530236 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530238 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530240 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530243 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530246 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530249 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530251 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530254 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530256 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530259 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530262 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530264 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530267 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530269 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530272 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530274 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530277 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:14:22.532130 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530280 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530283 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530285 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530288 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530305 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530308 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530311 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530313 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530316 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530319 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530322 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530324 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530327 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530330 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530333 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530336 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530338 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530341 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530343 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530346 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:14:22.532657 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530348 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530351 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530354 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530357 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530359 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530362 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530364 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530366 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530369 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530373 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530376 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530380 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530383 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530385 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530388 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530390 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530395 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530398 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530401 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530404 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:14:22.533162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530407 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530409 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530412 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530414 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530417 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530421 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
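The feature-gate warnings above appear in two passes (the kubelet parses the configured gates twice at startup, so each unknown gate is reported more than once). A minimal sketch for summarizing them; the sample entries are copied verbatim from this log, and a real run would feed in the full `journalctl -u kubelet` output instead:

```python
import re
from collections import Counter

# Sample journal entries copied from the log above; a real run would read
# the complete kubelet journal stream instead of this hard-coded string.
sample = """\
Apr 23 08:14:22.529587 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529572 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:14:22.529650 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.529632 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:14:22.531649 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530190 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
"""

GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def unrecognized_gates(log: str) -> Counter:
    """Count how often each unrecognized feature gate is reported."""
    return Counter(GATE_RE.findall(log))

counts = unrecognized_gates(sample)
print(counts)  # VSphereHostVMGroupZonal counted twice, once per parsing pass
```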
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530425 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530428 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530431 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530434 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530436 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530439 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.530442 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531155 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531165 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531172 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531176 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531181 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531184 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531188 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531193 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:14:22.533671 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531197 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531200 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531204 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531207 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531210 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531213 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531217 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531220 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531223 2575 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531225 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531228 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531256 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531260 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531263 2575 flags.go:64] FLAG: --config-dir=""
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531266 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531270 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531274 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531277 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531280 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531283 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531287 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531305 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531310 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531316 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531319 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:14:22.534181 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531324 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531328 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531331 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531334 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531337 2575 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531340 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531345 2575 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531348 2575 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531351 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531354 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531357 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531361 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531364 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531367 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531370 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531374 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531377 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531380 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531383 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531386 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531389 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531392 2575 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531396 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531399 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531402 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 08:14:22.534826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531406 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531409 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531412 2575 flags.go:64] FLAG: --help="false"
Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531415 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-129-53.ec2.internal"
Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531418 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531421 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531424 2575 flags.go:64] FLAG:
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531428 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531431 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531435 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531438 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531441 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531444 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531448 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531451 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531454 2575 flags.go:64] FLAG: --kube-reserved="" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531457 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531460 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531463 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531466 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 08:14:22.535431 ip-10-0-129-53 
kubenswrapper[2575]: I0423 08:14:22.531469 2575 flags.go:64] FLAG: --lock-file="" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531472 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531475 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531478 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 08:14:22.535431 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531484 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531487 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531490 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531493 2575 flags.go:64] FLAG: --logging-format="text" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531496 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531500 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531503 2575 flags.go:64] FLAG: --manifest-url="" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531506 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531510 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531513 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531517 2575 flags.go:64] FLAG: --max-pods="110" Apr 23 08:14:22.536059 
ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531520 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531523 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531527 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531530 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531533 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531536 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531539 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531548 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531551 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531554 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531558 2575 flags.go:64] FLAG: --pod-cidr="" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531563 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 08:14:22.536059 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531569 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531572 2575 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531575 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531578 2575 flags.go:64] FLAG: --port="10250" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531581 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531584 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-008625eb37ca7a84b" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531587 2575 flags.go:64] FLAG: --qos-reserved="" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531591 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531594 2575 flags.go:64] FLAG: --register-node="true" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531596 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531599 2575 flags.go:64] FLAG: --register-with-taints="" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531606 2575 flags.go:64] FLAG: --registry-burst="10" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531609 2575 flags.go:64] FLAG: --registry-qps="5" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531612 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531614 2575 flags.go:64] FLAG: --reserved-memory="" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531619 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531622 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 
08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531625 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531628 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531631 2575 flags.go:64] FLAG: --runonce="false" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531634 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531638 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531641 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531643 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531646 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531649 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 08:14:22.536644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531653 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531656 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531661 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531664 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531667 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 
08:14:22.531671 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531674 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531677 2575 flags.go:64] FLAG: --system-cgroups="" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531680 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531686 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531689 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531692 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531696 2575 flags.go:64] FLAG: --tls-min-version="" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531699 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531702 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531705 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531708 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531711 2575 flags.go:64] FLAG: --v="2" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531716 2575 flags.go:64] FLAG: --version="false" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531719 2575 flags.go:64] FLAG: --vmodule="" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 
08:14:22.531723 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.531727 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531818 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531823 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 08:14:22.537305 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531828 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531831 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531834 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531838 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531841 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531844 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531847 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531850 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531854 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:14:22.537892 
ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531856 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531859 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531862 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531865 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531869 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531873 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531876 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531879 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531882 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531885 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:14:22.537892 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531888 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531891 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 
08:14:22.531894 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531897 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531899 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531903 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531906 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531908 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531911 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531914 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531917 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531920 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531922 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531925 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531927 2575 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstall Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531930 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531933 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531936 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531938 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:14:22.538490 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531941 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531943 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531947 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531950 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531952 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531955 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531958 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531962 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531965 2575 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531967 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531970 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531972 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531975 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531978 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531980 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531983 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531985 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531988 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531991 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531993 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:14:22.539288 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.531996 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:14:22.539799 ip-10-0-129-53 
kubenswrapper[2575]: W0423 08:14:22.531998 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532001 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532003 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532006 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532008 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532011 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532014 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532016 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532019 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532021 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532023 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532026 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532029 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:14:22.539799 ip-10-0-129-53 
kubenswrapper[2575]: W0423 08:14:22.532033 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532036 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532038 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532041 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532043 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532048 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:14:22.539799 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532050 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532053 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532055 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532058 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532061 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.532063 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.532868 2575 feature_gate.go:384] feature 
gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.540197 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.540212 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540267 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540273 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540277 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540280 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540284 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540287 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:14:22.540322 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540302 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540306 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540309 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540311 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540314 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540317 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540320 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540323 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540326 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540328 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540331 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540334 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540337 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540340 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540343 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540345 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540349 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540353 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540356 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540358 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:14:22.540703 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540361 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540363 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540369 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540373 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540376 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540379 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540382 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540385 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540387 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540390 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540392 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540395 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540397 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540400 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540403 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540406 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540409 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540412 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540414 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:14:22.541230 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540417 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540420 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540423 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540425 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540428 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540431 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540433 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540436 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540438 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540441 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540444 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540446 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540449 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540452 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540455 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540457 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540460 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540463 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540465 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540468 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:14:22.541696 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540471 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540474 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540476 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540479 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540481 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540485 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540487 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540491 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540493 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540496 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540499 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540505 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540508 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540511 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540514 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540516 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540519 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540521 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540524 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540526 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:14:22.542181 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540529 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.540534 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540624 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540629 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540633 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540636 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540639 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540641 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540644 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540647 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540649 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540652 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540654 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540657 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540659 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:14:22.542755 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540662 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540665 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540667 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540670 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540673 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540676 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540679 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540682 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540684 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540688 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540690 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540693 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540696 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540699 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540701 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540704 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540707 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540710 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540712 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540715 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:14:22.543113 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540718 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540721 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540724 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540726 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540729 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540731 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540734 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540736 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540739 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540741 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540743 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540746 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540748 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540751 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540755 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540757 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540760 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540763 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540765 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540768 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:14:22.543614 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540770 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540773 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540776 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540780 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540784 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540788 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540792 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540795 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540798 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540801 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540803 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540806 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540809 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540811 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540814 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540816 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540819 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540822 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540825 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:14:22.544087 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540828 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540830 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540833 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540835 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540838 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540841 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540844 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540846 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540849 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540852 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540854 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540857 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540859 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:22.540862 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.540867 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:14:22.544573 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.540964 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 08:14:22.544940 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.544102 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 08:14:22.545977 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.545965 2575 server.go:1019] "Starting client certificate rotation"
Apr 23 08:14:22.546079 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.546062 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:14:22.546109 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.546104 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:14:22.572504 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.572488 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:14:22.575145 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.575125 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:14:22.589398 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.589380 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 23 08:14:22.596472 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.596454 2575 log.go:25] "Validated CRI v1 image API"
Apr 23 08:14:22.597929 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.597913 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 08:14:22.604022 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.604003 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 95c01802-04ad-4804-9809-47402f889dc1:/dev/nvme0n1p3 c413b63c-7c13-4f04-8524-6c18a4b7eab6:/dev/nvme0n1p4]
Apr 23 08:14:22.604097 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.604021 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 08:14:22.605441 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.605422 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:14:22.609666 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.609566 2575 manager.go:217] Machine: {Timestamp:2026-04-23 08:14:22.607602279 +0000 UTC m=+0.405992738 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096869 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2179f5368e85fd6845191fcdd10613 SystemUUID:ec2179f5-368e-85fd-6845-191fcdd10613 BootID:7242728b-bdfb-4d94-ab0d-2e495797cbf6 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:42:03:eb:43:c1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:42:03:eb:43:c1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:e3:14:63:6d:13 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 08:14:22.609666 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.609660 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 08:14:22.609786 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.609766 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 08:14:22.610660 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.610636 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 08:14:22.610812 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.610662 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-53.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 08:14:22.610856 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.610821 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 08:14:22.610856 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.610830 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 08:14:22.610856 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.610843 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:14:22.610856 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.610856 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:14:22.612153 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.612143 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:14:22.612279 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.612270 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 08:14:22.614906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.614898 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 08:14:22.614938 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.614915 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 08:14:22.614938 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.614926 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 08:14:22.614938 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.614936 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 23 08:14:22.615038 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.614944 2575 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 23 08:14:22.616062 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.616043 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 08:14:22.616062 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.616059 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 08:14:22.619042 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.619025 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 08:14:22.620406 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.620392 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 08:14:22.622475 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622463 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 08:14:22.622512 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622494 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 08:14:22.622512 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622505 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 08:14:22.622512 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622511 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 08:14:22.622600 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622517 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 08:14:22.622600 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622523 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 08:14:22.622600 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622529 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 
08:14:22.622600 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622535 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 08:14:22.622600 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622542 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 08:14:22.622600 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622551 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 08:14:22.622600 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622564 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 08:14:22.622600 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.622575 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 08:14:22.623368 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.623359 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 08:14:22.623368 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.623369 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 08:14:22.626449 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.626327 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f5mzr" Apr 23 08:14:22.626741 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.626730 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 08:14:22.626773 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.626767 2575 server.go:1295] "Started kubelet" Apr 23 08:14:22.626872 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.626841 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 08:14:22.627386 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.627328 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 08:14:22.627441 ip-10-0-129-53 
kubenswrapper[2575]: I0423 08:14:22.627418 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 08:14:22.627643 ip-10-0-129-53 systemd[1]: Started Kubernetes Kubelet. Apr 23 08:14:22.628224 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.628190 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-53.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 08:14:22.628446 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.628381 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-53.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 08:14:22.628494 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.628474 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 08:14:22.632308 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.632280 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 23 08:14:22.632416 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.632397 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 08:14:22.634732 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.634710 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f5mzr" Apr 23 08:14:22.637472 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.636454 2575 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-53.ec2.internal.18a8ee4bdede7db6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-53.ec2.internal,UID:ip-10-0-129-53.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-53.ec2.internal,},FirstTimestamp:2026-04-23 08:14:22.626741686 +0000 UTC m=+0.425132144,LastTimestamp:2026-04-23 08:14:22.626741686 +0000 UTC m=+0.425132144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-53.ec2.internal,}" Apr 23 08:14:22.638407 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.638386 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 08:14:22.638867 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.638848 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 08:14:22.639643 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.639496 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 08:14:22.639726 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.639648 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 08:14:22.639726 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.639666 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 08:14:22.639905 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.639774 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:22.639905 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.639787 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 
23 08:14:22.639905 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.639795 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 23 08:14:22.640040 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.639943 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 08:14:22.640142 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.640125 2575 factory.go:55] Registering systemd factory Apr 23 08:14:22.640202 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.640186 2575 factory.go:223] Registration of the systemd container factory successfully Apr 23 08:14:22.640450 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.640437 2575 factory.go:153] Registering CRI-O factory Apr 23 08:14:22.640524 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.640454 2575 factory.go:223] Registration of the crio container factory successfully Apr 23 08:14:22.640524 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.640508 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 08:14:22.640616 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.640536 2575 factory.go:103] Registering Raw factory Apr 23 08:14:22.640616 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.640556 2575 manager.go:1196] Started watching for new ooms in manager Apr 23 08:14:22.640957 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.640942 2575 manager.go:319] Starting recovery of all containers Apr 23 08:14:22.641496 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.641468 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-53.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API 
group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 23 08:14:22.641767 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.641743 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 23 08:14:22.650864 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.650850 2575 manager.go:324] Recovery completed Apr 23 08:14:22.654619 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.654607 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:22.656856 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.656841 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:22.656924 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.656867 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:22.656924 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.656877 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:22.657634 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.657620 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 08:14:22.657634 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.657633 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 08:14:22.657774 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.657650 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 23 08:14:22.659923 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.659911 2575 policy_none.go:49] "None policy: Start" 
Apr 23 08:14:22.659985 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.659927 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 08:14:22.659985 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.659937 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 23 08:14:22.703847 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.703830 2575 manager.go:341] "Starting Device Plugin manager" Apr 23 08:14:22.714973 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.703893 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 08:14:22.714973 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.703903 2575 server.go:85] "Starting device plugin registration server" Apr 23 08:14:22.714973 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.704105 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 08:14:22.714973 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.704115 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 08:14:22.714973 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.704202 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 08:14:22.714973 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.704284 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 08:14:22.714973 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.704335 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 08:14:22.714973 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.704748 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 23 08:14:22.714973 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.704779 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:22.771163 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.771136 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 08:14:22.772382 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.772367 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 08:14:22.772476 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.772389 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 08:14:22.772476 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.772405 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 23 08:14:22.772476 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.772412 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 08:14:22.772476 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.772440 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 08:14:22.775106 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.775063 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:14:22.804515 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.804499 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:22.805528 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.805514 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:22.805620 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.805546 2575 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:22.805620 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.805561 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:22.805620 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.805590 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-53.ec2.internal" Apr 23 08:14:22.811116 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.811103 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-53.ec2.internal" Apr 23 08:14:22.811182 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.811122 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-53.ec2.internal\": node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:22.822591 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.822573 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:22.873281 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.873259 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal"] Apr 23 08:14:22.873378 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.873342 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:22.874667 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.874654 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:22.874735 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.874678 2575 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-129-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:22.874735 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.874687 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:22.876872 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.876859 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:22.877002 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.876988 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" Apr 23 08:14:22.877053 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.877015 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:22.877521 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.877497 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:22.877582 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.877523 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:22.877582 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.877537 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:22.877582 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.877541 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:22.877582 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.877561 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:22.877582 
ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.877571 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:22.879653 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.879634 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal" Apr 23 08:14:22.879653 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.879657 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:22.880322 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.880305 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:22.880404 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.880331 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:22.880404 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.880341 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:22.902826 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.902802 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-53.ec2.internal\" not found" node="ip-10-0-129-53.ec2.internal" Apr 23 08:14:22.907053 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.907037 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-53.ec2.internal\" not found" node="ip-10-0-129-53.ec2.internal" Apr 23 08:14:22.923261 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:22.923243 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 
23 08:14:22.941625 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.941594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9b0e838abf38c7589ab4ddfd6dec535f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-53.ec2.internal\" (UID: \"9b0e838abf38c7589ab4ddfd6dec535f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal" Apr 23 08:14:22.941694 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.941631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/63ca270e6c7ab8749d592459a749af4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal\" (UID: \"63ca270e6c7ab8749d592459a749af4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" Apr 23 08:14:22.941694 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:22.941657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63ca270e6c7ab8749d592459a749af4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal\" (UID: \"63ca270e6c7ab8749d592459a749af4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" Apr 23 08:14:23.024043 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:23.024020 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:23.042503 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.042444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9b0e838abf38c7589ab4ddfd6dec535f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-53.ec2.internal\" (UID: \"9b0e838abf38c7589ab4ddfd6dec535f\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal" Apr 23 08:14:23.042503 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.042494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9b0e838abf38c7589ab4ddfd6dec535f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-53.ec2.internal\" (UID: \"9b0e838abf38c7589ab4ddfd6dec535f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal" Apr 23 08:14:23.042583 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.042496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/63ca270e6c7ab8749d592459a749af4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal\" (UID: \"63ca270e6c7ab8749d592459a749af4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" Apr 23 08:14:23.042583 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.042533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/63ca270e6c7ab8749d592459a749af4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal\" (UID: \"63ca270e6c7ab8749d592459a749af4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" Apr 23 08:14:23.042583 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.042538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63ca270e6c7ab8749d592459a749af4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal\" (UID: \"63ca270e6c7ab8749d592459a749af4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" Apr 23 08:14:23.042672 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.042588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63ca270e6c7ab8749d592459a749af4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal\" (UID: \"63ca270e6c7ab8749d592459a749af4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" Apr 23 08:14:23.124785 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:23.124755 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:23.205261 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.205239 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" Apr 23 08:14:23.209981 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.209961 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal" Apr 23 08:14:23.225540 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:23.225522 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:23.326042 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:23.325980 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:23.426460 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:23.426419 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:23.527063 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:23.527035 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:23.545341 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.545324 2575 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 23 08:14:23.545465 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.545448 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 08:14:23.627965 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:23.627905 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:23.638576 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.638558 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 08:14:23.638670 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.638567 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:09:22 +0000 UTC" deadline="2028-01-27 13:43:19.327888718 +0000 UTC" Apr 23 08:14:23.638670 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.638596 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15461h28m55.689296631s" Apr 23 08:14:23.647848 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.647825 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 08:14:23.671251 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.671227 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zldr8" Apr 23 08:14:23.678735 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.678694 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-zldr8" Apr 23 08:14:23.704675 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.704653 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:14:23.728647 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:23.728624 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:23.799254 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:23.799227 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0e838abf38c7589ab4ddfd6dec535f.slice/crio-2c670a2f1bd4dfb2520b6c5bfa2bec997510378249a60426a3f52d7657631b60 WatchSource:0}: Error finding container 2c670a2f1bd4dfb2520b6c5bfa2bec997510378249a60426a3f52d7657631b60: Status 404 returned error can't find the container with id 2c670a2f1bd4dfb2520b6c5bfa2bec997510378249a60426a3f52d7657631b60 Apr 23 08:14:23.799575 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:23.799555 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ca270e6c7ab8749d592459a749af4f.slice/crio-3807dc3e60b4afa16521e386635a354de3b0ec400b91405cb6eca703388e7ed1 WatchSource:0}: Error finding container 3807dc3e60b4afa16521e386635a354de3b0ec400b91405cb6eca703388e7ed1: Status 404 returned error can't find the container with id 3807dc3e60b4afa16521e386635a354de3b0ec400b91405cb6eca703388e7ed1 Apr 23 08:14:23.805256 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.805241 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:14:23.829109 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:23.829086 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:23.929646 ip-10-0-129-53 kubenswrapper[2575]: E0423 
08:14:23.929572 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-53.ec2.internal\" not found" Apr 23 08:14:23.994779 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:23.994751 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:14:24.039893 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.039871 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" Apr 23 08:14:24.051318 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.051289 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:14:24.053115 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.053103 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal" Apr 23 08:14:24.061474 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.061453 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:14:24.226925 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.226841 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:14:24.616639 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.616566 2575 apiserver.go:52] "Watching apiserver" Apr 23 08:14:24.626677 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.626650 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 08:14:24.628739 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.628707 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578","openshift-image-registry/node-ca-ht997","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal","openshift-multus/multus-additional-cni-plugins-kx29n","openshift-multus/network-metrics-daemon-gtsb8","kube-system/konnectivity-agent-vj7h5","kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal","openshift-cluster-node-tuning-operator/tuned-vrk6z","openshift-dns/node-resolver-sx7f6","openshift-multus/multus-dhsc2","openshift-network-diagnostics/network-check-target-b4f6z","openshift-network-operator/iptables-alerter-rtkxh","openshift-ovn-kubernetes/ovnkube-node-xlcv7"] Apr 23 08:14:24.631451 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.631427 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.633932 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.633912 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 08:14:24.634151 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.634130 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 08:14:24.634231 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.634149 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ht997" Apr 23 08:14:24.634891 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.634678 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 08:14:24.634891 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.634736 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 08:14:24.635029 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.634991 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-czlf9\"" Apr 23 08:14:24.636448 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.636421 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gk9j6\"" Apr 23 08:14:24.636559 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.636523 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 08:14:24.636620 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.636421 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 08:14:24.636770 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.636754 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 08:14:24.638656 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.638619 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.640788 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.640760 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 08:14:24.640931 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.640914 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 08:14:24.641014 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.640961 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:24.641075 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:24.641023 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63" Apr 23 08:14:24.641761 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.641743 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-swx5j\"" Apr 23 08:14:24.643191 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.643172 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-vj7h5" Apr 23 08:14:24.646056 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.645523 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 08:14:24.646056 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.645549 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 08:14:24.646056 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.645710 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.646056 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.645746 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6rgd8\"" Apr 23 08:14:24.647963 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.647943 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-sx7f6" Apr 23 08:14:24.648062 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.647985 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 08:14:24.648471 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.648363 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:14:24.648471 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.648376 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z2mkl\"" Apr 23 08:14:24.650085 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650046 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 08:14:24.650716 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650212 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k6dm6\"" Apr 23 08:14:24.650716 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650382 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 08:14:24.650716 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650444 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.650947 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-system-cni-dir\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.650947 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-etc-kubernetes\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.650947 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-cni-binary-copy\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.650947 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:24.650947 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-cnibin\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.650947 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-daemon-config\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.650947 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ccf4efd3-2881-48aa-80c7-44d62f243db8-serviceca\") pod \"node-ca-ht997\" (UID: \"ccf4efd3-2881-48aa-80c7-44d62f243db8\") " pod="openshift-image-registry/node-ca-ht997" Apr 23 08:14:24.650947 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gftvl\" (UniqueName: \"kubernetes.io/projected/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-kube-api-access-gftvl\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.651345 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.650974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-var-lib-cni-bin\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651345 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651044 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.651345 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e3ee3b59-dda6-4efc-944c-7d052a7a6c46-agent-certs\") pod \"konnectivity-agent-vj7h5\" (UID: \"e3ee3b59-dda6-4efc-944c-7d052a7a6c46\") " pod="kube-system/konnectivity-agent-vj7h5" Apr 23 08:14:24.651345 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e3ee3b59-dda6-4efc-944c-7d052a7a6c46-konnectivity-ca\") pod \"konnectivity-agent-vj7h5\" (UID: \"e3ee3b59-dda6-4efc-944c-7d052a7a6c46\") " pod="kube-system/konnectivity-agent-vj7h5" Apr 23 08:14:24.651345 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-conf-dir\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651345 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651198 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-run-multus-certs\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " 
pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651345 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651239 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4bz6\" (UniqueName: \"kubernetes.io/projected/b040571c-3a7e-407f-8b6a-b70862f5b8c0-kube-api-access-q4bz6\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651345 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651268 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf4efd3-2881-48aa-80c7-44d62f243db8-host\") pod \"node-ca-ht997\" (UID: \"ccf4efd3-2881-48aa-80c7-44d62f243db8\") " pod="openshift-image-registry/node-ca-ht997" Apr 23 08:14:24.651345 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651309 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.651345 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-var-lib-cni-multus\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651360 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-socket-dir-parent\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-system-cni-dir\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-cnibin\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b040571c-3a7e-407f-8b6a-b70862f5b8c0-cni-binary-copy\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-run-netns\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651526 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-var-lib-kubelet\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k9tq\" (UniqueName: \"kubernetes.io/projected/ccf4efd3-2881-48aa-80c7-44d62f243db8-kube-api-access-4k9tq\") pod \"node-ca-ht997\" (UID: \"ccf4efd3-2881-48aa-80c7-44d62f243db8\") " pod="openshift-image-registry/node-ca-ht997" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzwf4\" (UniqueName: \"kubernetes.io/projected/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-kube-api-access-tzwf4\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-cni-dir\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-os-release\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651917 
ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-run-k8s-cni-cncf-io\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-hostroot\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-os-release\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.651917 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.651706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.652711 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.652688 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-m8kzb\"" Apr 23 08:14:24.652986 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.652966 
2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 08:14:24.653071 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.653057 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 08:14:24.653132 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.653072 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 08:14:24.653994 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.653963 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z" Apr 23 08:14:24.654093 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:24.654026 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa" Apr 23 08:14:24.654093 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.654073 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rtkxh" Apr 23 08:14:24.656169 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.656147 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zn7rr\"" Apr 23 08:14:24.656360 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.656339 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.656810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.656790 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 08:14:24.656960 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.656935 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 08:14:24.657218 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.657192 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:14:24.658630 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.658611 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 08:14:24.659824 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.659593 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 08:14:24.659824 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.659779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 08:14:24.660406 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.660386 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 08:14:24.662381 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.661559 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t9gsk\"" Apr 23 08:14:24.662381 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.661614 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 08:14:24.662381 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.661925 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 08:14:24.679340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.679304 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:09:23 +0000 UTC" deadline="2028-02-01 21:41:24.287445374 +0000 UTC" Apr 23 08:14:24.679340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.679330 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15589h26m59.608119297s" Apr 23 08:14:24.695669 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.695650 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:14:24.740672 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.740653 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 08:14:24.752410 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-var-lib-kubelet\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.752522 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f798ddd1-abe6-46fe-8f87-51eb8f211cba-tmp-dir\") pod \"node-resolver-sx7f6\" (UID: \"f798ddd1-abe6-46fe-8f87-51eb8f211cba\") " pod="openshift-dns/node-resolver-sx7f6" Apr 23 08:14:24.752522 
ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-run-netns\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.752522 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-etc-openvswitch\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.752522 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-var-lib-kubelet\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.752706 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-ovn-node-metrics-cert\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.752706 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-run-k8s-cni-cncf-io\") pod \"multus-dhsc2\" (UID: 
\"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.752706 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-lib-modules\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.752706 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-var-lib-kubelet\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.752706 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752686 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-run-k8s-cni-cncf-io\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.752933 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f798ddd1-abe6-46fe-8f87-51eb8f211cba-hosts-file\") pod \"node-resolver-sx7f6\" (UID: \"f798ddd1-abe6-46fe-8f87-51eb8f211cba\") " pod="openshift-dns/node-resolver-sx7f6" Apr 23 08:14:24.752933 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-run-openvswitch\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.752933 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752769 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pprj\" (UniqueName: \"kubernetes.io/projected/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-kube-api-access-2pprj\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.752933 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-cni-binary-copy\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.752933 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:24.752933 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752855 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-sys\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.752933 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752883 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-etc-selinux\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.752933 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-var-lib-openvswitch\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-cnibin\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-daemon-config\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.752985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ccf4efd3-2881-48aa-80c7-44d62f243db8-serviceca\") pod \"node-ca-ht997\" (UID: \"ccf4efd3-2881-48aa-80c7-44d62f243db8\") " pod="openshift-image-registry/node-ca-ht997" Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:24.753010 2575 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-cnibin\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:24.753082 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs podName:75af54d6-d4ae-4e8e-bf63-80cc7a54fe63 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:25.253050619 +0000 UTC m=+3.051441073 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs") pod "network-metrics-daemon-gtsb8" (UID: "75af54d6-d4ae-4e8e-bf63-80cc7a54fe63") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gftvl\" (UniqueName: \"kubernetes.io/projected/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-kube-api-access-gftvl\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753194 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-sys-fs\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e3ee3b59-dda6-4efc-944c-7d052a7a6c46-agent-certs\") pod \"konnectivity-agent-vj7h5\" (UID: \"e3ee3b59-dda6-4efc-944c-7d052a7a6c46\") " pod="kube-system/konnectivity-agent-vj7h5" Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753250 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-run\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.753340 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-env-overrides\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753350 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-run-multus-certs\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4bz6\" (UniqueName: \"kubernetes.io/projected/b040571c-3a7e-407f-8b6a-b70862f5b8c0-kube-api-access-q4bz6\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753423 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-tuned\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9d7w\" (UniqueName: \"kubernetes.io/projected/908435e9-5714-4471-be36-df61c28816d3-kube-api-access-j9d7w\") pod \"iptables-alerter-rtkxh\" (UID: \"908435e9-5714-4471-be36-df61c28816d3\") " pod="openshift-network-operator/iptables-alerter-rtkxh" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753462 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ccf4efd3-2881-48aa-80c7-44d62f243db8-serviceca\") pod \"node-ca-ht997\" (UID: \"ccf4efd3-2881-48aa-80c7-44d62f243db8\") " pod="openshift-image-registry/node-ca-ht997" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnlkv\" (UniqueName: \"kubernetes.io/projected/014c9a15-8c18-48c5-afe1-53f597a02932-kube-api-access-dnlkv\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/908435e9-5714-4471-be36-df61c28816d3-iptables-alerter-script\") pod \"iptables-alerter-rtkxh\" (UID: \"908435e9-5714-4471-be36-df61c28816d3\") " pod="openshift-network-operator/iptables-alerter-rtkxh" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753504 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-cni-binary-copy\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-ovnkube-config\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753601 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-run-multus-certs\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753650 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-cnibin\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2d80946-cce7-4f66-bd36-9b3413670c78-tmp\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-cnibin\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: 
I0423 08:14:24.753798 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzmgq\" (UniqueName: \"kubernetes.io/projected/f798ddd1-abe6-46fe-8f87-51eb8f211cba-kube-api-access-hzmgq\") pod \"node-resolver-sx7f6\" (UID: \"f798ddd1-abe6-46fe-8f87-51eb8f211cba\") " pod="openshift-dns/node-resolver-sx7f6" Apr 23 08:14:24.753934 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-ovnkube-script-lib\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4k9tq\" (UniqueName: \"kubernetes.io/projected/ccf4efd3-2881-48aa-80c7-44d62f243db8-kube-api-access-4k9tq\") pod \"node-ca-ht997\" (UID: \"ccf4efd3-2881-48aa-80c7-44d62f243db8\") " pod="openshift-image-registry/node-ca-ht997" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzwf4\" (UniqueName: \"kubernetes.io/projected/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-kube-api-access-tzwf4\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753910 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-run-ovn-kubernetes\") pod \"ovnkube-node-xlcv7\" (UID: 
\"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.753955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-cni-netd\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-cni-dir\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-os-release\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-hostroot\") pod \"multus-dhsc2\" (UID: 
\"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-os-release\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-cni-dir\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754199 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-os-release\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nc76\" (UniqueName: \"kubernetes.io/projected/b2d80946-cce7-4f66-bd36-9b3413670c78-kube-api-access-4nc76\") pod \"tuned-vrk6z\" 
(UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754237 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-hostroot\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754234 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-daemon-config\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-systemd-units\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.754668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754283 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-os-release\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-cni-bin\") pod 
\"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-system-cni-dir\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754380 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-etc-kubernetes\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-system-cni-dir\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-sysctl-d\") pod \"tuned-vrk6z\" (UID: 
\"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-etc-kubernetes\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-run-systemd\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-host\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-var-lib-cni-bin\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e3ee3b59-dda6-4efc-944c-7d052a7a6c46-konnectivity-ca\") pod \"konnectivity-agent-vj7h5\" (UID: \"e3ee3b59-dda6-4efc-944c-7d052a7a6c46\") " pod="kube-system/konnectivity-agent-vj7h5" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-var-lib-cni-bin\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754601 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754624 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-socket-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-registration-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.755282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754669 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/908435e9-5714-4471-be36-df61c28816d3-host-slash\") pod \"iptables-alerter-rtkxh\" (UID: \"908435e9-5714-4471-be36-df61c28816d3\") " pod="openshift-network-operator/iptables-alerter-rtkxh" Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-log-socket\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-conf-dir\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf4efd3-2881-48aa-80c7-44d62f243db8-host\") pod \"node-ca-ht997\" (UID: \"ccf4efd3-2881-48aa-80c7-44d62f243db8\") " pod="openshift-image-registry/node-ca-ht997" Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754763 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-modprobe-d\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-kubernetes\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-device-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-var-lib-cni-multus\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-run-ovn\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.755911 
ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-node-log\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-socket-dir-parent\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2"
Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-system-cni-dir\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n"
Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-sysconfig\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf4efd3-2881-48aa-80c7-44d62f243db8-host\") pod \"node-ca-ht997\" (UID: \"ccf4efd3-2881-48aa-80c7-44d62f243db8\") " pod="openshift-image-registry/node-ca-ht997"
Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.754989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-sysctl-conf\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n"
Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-systemd\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.755911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e3ee3b59-dda6-4efc-944c-7d052a7a6c46-konnectivity-ca\") pod \"konnectivity-agent-vj7h5\" (UID: \"e3ee3b59-dda6-4efc-944c-7d052a7a6c46\") " pod="kube-system/konnectivity-agent-vj7h5"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755075 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw8rp\" (UniqueName: \"kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp\") pod \"network-check-target-b4f6z\" (UID: \"b566e32e-9149-4af2-a4dd-4dba40f00efa\") " pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755080 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-conf-dir\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755077 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-multus-socket-dir-parent\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-system-cni-dir\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-var-lib-cni-multus\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-kubelet\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-slash\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b040571c-3a7e-407f-8b6a-b70862f5b8c0-cni-binary-copy\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-run-netns\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.755335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b040571c-3a7e-407f-8b6a-b70862f5b8c0-host-run-netns\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2"
Apr 23 08:14:24.756608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.756198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b040571c-3a7e-407f-8b6a-b70862f5b8c0-cni-binary-copy\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2"
Apr 23 08:14:24.757187 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.756695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e3ee3b59-dda6-4efc-944c-7d052a7a6c46-agent-certs\") pod \"konnectivity-agent-vj7h5\" (UID: \"e3ee3b59-dda6-4efc-944c-7d052a7a6c46\") " pod="kube-system/konnectivity-agent-vj7h5"
Apr 23 08:14:24.762077 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.762022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gftvl\" (UniqueName: \"kubernetes.io/projected/00bd34ed-f0ff-40f8-bd12-a9c364e0fe82-kube-api-access-gftvl\") pod \"multus-additional-cni-plugins-kx29n\" (UID: \"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82\") " pod="openshift-multus/multus-additional-cni-plugins-kx29n"
Apr 23 08:14:24.762671 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.762649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k9tq\" (UniqueName: \"kubernetes.io/projected/ccf4efd3-2881-48aa-80c7-44d62f243db8-kube-api-access-4k9tq\") pod \"node-ca-ht997\" (UID: \"ccf4efd3-2881-48aa-80c7-44d62f243db8\") " pod="openshift-image-registry/node-ca-ht997"
Apr 23 08:14:24.762871 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.762841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzwf4\" (UniqueName: \"kubernetes.io/projected/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-kube-api-access-tzwf4\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:24.762974 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.762948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4bz6\" (UniqueName: \"kubernetes.io/projected/b040571c-3a7e-407f-8b6a-b70862f5b8c0-kube-api-access-q4bz6\") pod \"multus-dhsc2\" (UID: \"b040571c-3a7e-407f-8b6a-b70862f5b8c0\") " pod="openshift-multus/multus-dhsc2"
Apr 23 08:14:24.777604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.777533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" event={"ID":"63ca270e6c7ab8749d592459a749af4f","Type":"ContainerStarted","Data":"3807dc3e60b4afa16521e386635a354de3b0ec400b91405cb6eca703388e7ed1"}
Apr 23 08:14:24.779561 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.779533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal" event={"ID":"9b0e838abf38c7589ab4ddfd6dec535f","Type":"ContainerStarted","Data":"2c670a2f1bd4dfb2520b6c5bfa2bec997510378249a60426a3f52d7657631b60"}
Apr 23 08:14:24.855725 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.855694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9d7w\" (UniqueName: \"kubernetes.io/projected/908435e9-5714-4471-be36-df61c28816d3-kube-api-access-j9d7w\") pod \"iptables-alerter-rtkxh\" (UID: \"908435e9-5714-4471-be36-df61c28816d3\") " pod="openshift-network-operator/iptables-alerter-rtkxh"
Apr 23 08:14:24.855879 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.855732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnlkv\" (UniqueName: \"kubernetes.io/projected/014c9a15-8c18-48c5-afe1-53f597a02932-kube-api-access-dnlkv\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578"
Apr 23 08:14:24.855879 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.855755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/908435e9-5714-4471-be36-df61c28816d3-iptables-alerter-script\") pod \"iptables-alerter-rtkxh\" (UID: \"908435e9-5714-4471-be36-df61c28816d3\") " pod="openshift-network-operator/iptables-alerter-rtkxh"
Apr 23 08:14:24.855879 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.855777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-ovnkube-config\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.855879 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.855800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2d80946-cce7-4f66-bd36-9b3413670c78-tmp\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856083 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzmgq\" (UniqueName: \"kubernetes.io/projected/f798ddd1-abe6-46fe-8f87-51eb8f211cba-kube-api-access-hzmgq\") pod \"node-resolver-sx7f6\" (UID: \"f798ddd1-abe6-46fe-8f87-51eb8f211cba\") " pod="openshift-dns/node-resolver-sx7f6"
Apr 23 08:14:24.856131 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-ovnkube-script-lib\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856131 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-run-ovn-kubernetes\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856206 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-cni-netd\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856206 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-run-ovn-kubernetes\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856206 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nc76\" (UniqueName: \"kubernetes.io/projected/b2d80946-cce7-4f66-bd36-9b3413670c78-kube-api-access-4nc76\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856350 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-systemd-units\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856350 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-cni-bin\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856350 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-sysctl-d\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856350 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-run-systemd\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856350 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-host\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856495 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578"
Apr 23 08:14:24.856495 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-socket-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578"
Apr 23 08:14:24.856495 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856378 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/908435e9-5714-4471-be36-df61c28816d3-iptables-alerter-script\") pod \"iptables-alerter-rtkxh\" (UID: \"908435e9-5714-4471-be36-df61c28816d3\") " pod="openshift-network-operator/iptables-alerter-rtkxh"
Apr 23 08:14:24.856495 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-registration-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578"
Apr 23 08:14:24.856495 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/908435e9-5714-4471-be36-df61c28816d3-host-slash\") pod \"iptables-alerter-rtkxh\" (UID: \"908435e9-5714-4471-be36-df61c28816d3\") " pod="openshift-network-operator/iptables-alerter-rtkxh"
Apr 23 08:14:24.856495 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-ovnkube-config\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856495 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-log-socket\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-modprobe-d\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-kubernetes\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-cni-netd\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-device-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-sysctl-d\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-log-socket\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856611 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/908435e9-5714-4471-be36-df61c28816d3-host-slash\") pod \"iptables-alerter-rtkxh\" (UID: \"908435e9-5714-4471-be36-df61c28816d3\") " pod="openshift-network-operator/iptables-alerter-rtkxh"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-run-ovn\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-systemd-units\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-registration-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856658 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-node-log\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-kubernetes\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856703 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-host\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-modprobe-d\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-cni-bin\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-device-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-sysconfig\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.856810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-run-ovn\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856775 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856778 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-run-systemd\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-sysconfig\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-node-log\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-sysctl-conf\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856838 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-socket-dir\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-systemd\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8rp\" (UniqueName: \"kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp\") pod \"network-check-target-b4f6z\" (UID: \"b566e32e-9149-4af2-a4dd-4dba40f00efa\") " pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-ovnkube-script-lib\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-kubelet\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-slash\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-systemd\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856950 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-kubelet\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856953 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-sysctl-conf\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f798ddd1-abe6-46fe-8f87-51eb8f211cba-tmp-dir\") pod \"node-resolver-sx7f6\" (UID: \"f798ddd1-abe6-46fe-8f87-51eb8f211cba\") " pod="openshift-dns/node-resolver-sx7f6"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.856984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-slash\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-run-netns\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.857604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-etc-openvswitch\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-ovn-node-metrics-cert\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-lib-modules\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-etc-openvswitch\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-var-lib-kubelet\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-var-lib-kubelet\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f798ddd1-abe6-46fe-8f87-51eb8f211cba-hosts-file\") pod \"node-resolver-sx7f6\" (UID: \"f798ddd1-abe6-46fe-8f87-51eb8f211cba\") " pod="openshift-dns/node-resolver-sx7f6"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-lib-modules\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-run-openvswitch\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f798ddd1-abe6-46fe-8f87-51eb8f211cba-hosts-file\") pod \"node-resolver-sx7f6\" (UID: \"f798ddd1-abe6-46fe-8f87-51eb8f211cba\") " pod="openshift-dns/node-resolver-sx7f6"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-run-netns\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-run-openvswitch\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pprj\" (UniqueName: \"kubernetes.io/projected/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-kube-api-access-2pprj\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7"
Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f798ddd1-abe6-46fe-8f87-51eb8f211cba-tmp-dir\") pod \"node-resolver-sx7f6\" (UID:
\"f798ddd1-abe6-46fe-8f87-51eb8f211cba\") " pod="openshift-dns/node-resolver-sx7f6" Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-sys\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-etc-selinux\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-sys\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-var-lib-openvswitch\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.858374 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-etc-selinux\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: 
\"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-sys-fs\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-run\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/014c9a15-8c18-48c5-afe1-53f597a02932-sys-fs\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-var-lib-openvswitch\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857470 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2d80946-cce7-4f66-bd36-9b3413670c78-run\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-env-overrides\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-tuned\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.857987 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-env-overrides\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.859111 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.858204 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2d80946-cce7-4f66-bd36-9b3413670c78-tmp\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.859540 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.859520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-ovn-node-metrics-cert\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.859826 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.859786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b2d80946-cce7-4f66-bd36-9b3413670c78-etc-tuned\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.862380 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:24.862324 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:24.862380 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:24.862353 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:24.862380 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:24.862368 2575 
projected.go:194] Error preparing data for projected volume kube-api-access-dw8rp for pod openshift-network-diagnostics/network-check-target-b4f6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:24.862593 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:24.862420 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp podName:b566e32e-9149-4af2-a4dd-4dba40f00efa nodeName:}" failed. No retries permitted until 2026-04-23 08:14:25.362402604 +0000 UTC m=+3.160793067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dw8rp" (UniqueName: "kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp") pod "network-check-target-b4f6z" (UID: "b566e32e-9149-4af2-a4dd-4dba40f00efa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:24.864876 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.864819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pprj\" (UniqueName: \"kubernetes.io/projected/7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31-kube-api-access-2pprj\") pod \"ovnkube-node-xlcv7\" (UID: \"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31\") " pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:24.865068 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.865018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9d7w\" (UniqueName: \"kubernetes.io/projected/908435e9-5714-4471-be36-df61c28816d3-kube-api-access-j9d7w\") pod \"iptables-alerter-rtkxh\" (UID: \"908435e9-5714-4471-be36-df61c28816d3\") " pod="openshift-network-operator/iptables-alerter-rtkxh" Apr 23 08:14:24.865068 ip-10-0-129-53 kubenswrapper[2575]: 
I0423 08:14:24.865020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nc76\" (UniqueName: \"kubernetes.io/projected/b2d80946-cce7-4f66-bd36-9b3413670c78-kube-api-access-4nc76\") pod \"tuned-vrk6z\" (UID: \"b2d80946-cce7-4f66-bd36-9b3413670c78\") " pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.865353 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.865331 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzmgq\" (UniqueName: \"kubernetes.io/projected/f798ddd1-abe6-46fe-8f87-51eb8f211cba-kube-api-access-hzmgq\") pod \"node-resolver-sx7f6\" (UID: \"f798ddd1-abe6-46fe-8f87-51eb8f211cba\") " pod="openshift-dns/node-resolver-sx7f6" Apr 23 08:14:24.865427 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.865361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnlkv\" (UniqueName: \"kubernetes.io/projected/014c9a15-8c18-48c5-afe1-53f597a02932-kube-api-access-dnlkv\") pod \"aws-ebs-csi-driver-node-2w578\" (UID: \"014c9a15-8c18-48c5-afe1-53f597a02932\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:24.945093 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.945027 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dhsc2" Apr 23 08:14:24.958718 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.958682 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ht997" Apr 23 08:14:24.969263 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.969239 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kx29n" Apr 23 08:14:24.975858 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.975833 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-vj7h5" Apr 23 08:14:24.983387 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.983369 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" Apr 23 08:14:24.991965 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.991946 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sx7f6" Apr 23 08:14:24.998569 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:24.998553 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" Apr 23 08:14:25.007080 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.007063 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rtkxh" Apr 23 08:14:25.013685 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.013665 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:25.260768 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.260539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:25.260891 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:25.260683 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:25.260891 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:25.260840 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs podName:75af54d6-d4ae-4e8e-bf63-80cc7a54fe63 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:26.260824587 +0000 UTC m=+4.059215036 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs") pod "network-metrics-daemon-gtsb8" (UID: "75af54d6-d4ae-4e8e-bf63-80cc7a54fe63") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:25.425477 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:25.425450 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a42bc96_af2a_4e37_9f0b_3ef91d9e9e31.slice/crio-bc00845c8ece3666101054ad90d2af7c16969fba01641303d11d414002b7ce35 WatchSource:0}: Error finding container bc00845c8ece3666101054ad90d2af7c16969fba01641303d11d414002b7ce35: Status 404 returned error can't find the container with id bc00845c8ece3666101054ad90d2af7c16969fba01641303d11d414002b7ce35 Apr 23 08:14:25.426236 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:25.426216 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2d80946_cce7_4f66_bd36_9b3413670c78.slice/crio-05f73c9ef599cb11b2a8f8e104b0b04a83d699de7899f8357d0c5ec2b365af10 WatchSource:0}: Error finding container 05f73c9ef599cb11b2a8f8e104b0b04a83d699de7899f8357d0c5ec2b365af10: Status 404 returned error can't find the container with id 05f73c9ef599cb11b2a8f8e104b0b04a83d699de7899f8357d0c5ec2b365af10 Apr 23 08:14:25.428162 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:25.428109 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccf4efd3_2881_48aa_80c7_44d62f243db8.slice/crio-eefb2f11fea9267e2c82b18da54fa26bf455a804139e78c3325d8632541d03f5 WatchSource:0}: Error finding container eefb2f11fea9267e2c82b18da54fa26bf455a804139e78c3325d8632541d03f5: Status 404 returned error can't find the container with id eefb2f11fea9267e2c82b18da54fa26bf455a804139e78c3325d8632541d03f5 Apr 23 08:14:25.431269 
ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:25.431241 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod908435e9_5714_4471_be36_df61c28816d3.slice/crio-f54082c2ae4b65f626b20cb9ecad5e52f01e024a43e94faa2fb9fec9bc28a90d WatchSource:0}: Error finding container f54082c2ae4b65f626b20cb9ecad5e52f01e024a43e94faa2fb9fec9bc28a90d: Status 404 returned error can't find the container with id f54082c2ae4b65f626b20cb9ecad5e52f01e024a43e94faa2fb9fec9bc28a90d Apr 23 08:14:25.452786 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:25.452763 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod014c9a15_8c18_48c5_afe1_53f597a02932.slice/crio-9b4698d6ee7661e95df1cd3b781b937a637e9ae96d4115c0b8961a3bc7ae8426 WatchSource:0}: Error finding container 9b4698d6ee7661e95df1cd3b781b937a637e9ae96d4115c0b8961a3bc7ae8426: Status 404 returned error can't find the container with id 9b4698d6ee7661e95df1cd3b781b937a637e9ae96d4115c0b8961a3bc7ae8426 Apr 23 08:14:25.453828 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:25.453801 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ee3b59_dda6_4efc_944c_7d052a7a6c46.slice/crio-7f3ed11389ff559560127349c1f4ffcc760dbdeb87ceef560ed57c2ec0fc82f0 WatchSource:0}: Error finding container 7f3ed11389ff559560127349c1f4ffcc760dbdeb87ceef560ed57c2ec0fc82f0: Status 404 returned error can't find the container with id 7f3ed11389ff559560127349c1f4ffcc760dbdeb87ceef560ed57c2ec0fc82f0 Apr 23 08:14:25.454489 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:25.454469 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf798ddd1_abe6_46fe_8f87_51eb8f211cba.slice/crio-afc96e5d695ea78409026d29f1f3e3f2499611483587aec04beb8bb969e12577 WatchSource:0}: Error 
finding container afc96e5d695ea78409026d29f1f3e3f2499611483587aec04beb8bb969e12577: Status 404 returned error can't find the container with id afc96e5d695ea78409026d29f1f3e3f2499611483587aec04beb8bb969e12577 Apr 23 08:14:25.455968 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:25.455878 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb040571c_3a7e_407f_8b6a_b70862f5b8c0.slice/crio-92bf5d4362b7412ca2c58495d7ceae84d0d27cfe92c74e25232ea36e8f9716b8 WatchSource:0}: Error finding container 92bf5d4362b7412ca2c58495d7ceae84d0d27cfe92c74e25232ea36e8f9716b8: Status 404 returned error can't find the container with id 92bf5d4362b7412ca2c58495d7ceae84d0d27cfe92c74e25232ea36e8f9716b8 Apr 23 08:14:25.456758 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:14:25.456730 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00bd34ed_f0ff_40f8_bd12_a9c364e0fe82.slice/crio-a3873ca8a31eac850aee577b818c8f77f9b9f9314d778a9a189e4f0170c08333 WatchSource:0}: Error finding container a3873ca8a31eac850aee577b818c8f77f9b9f9314d778a9a189e4f0170c08333: Status 404 returned error can't find the container with id a3873ca8a31eac850aee577b818c8f77f9b9f9314d778a9a189e4f0170c08333 Apr 23 08:14:25.462452 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.462412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8rp\" (UniqueName: \"kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp\") pod \"network-check-target-b4f6z\" (UID: \"b566e32e-9149-4af2-a4dd-4dba40f00efa\") " pod="openshift-network-diagnostics/network-check-target-b4f6z" Apr 23 08:14:25.462625 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:25.462597 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 23 08:14:25.462625 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:25.462621 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:25.462731 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:25.462634 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dw8rp for pod openshift-network-diagnostics/network-check-target-b4f6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:25.462731 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:25.462696 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp podName:b566e32e-9149-4af2-a4dd-4dba40f00efa nodeName:}" failed. No retries permitted until 2026-04-23 08:14:26.462678924 +0000 UTC m=+4.261069382 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dw8rp" (UniqueName: "kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp") pod "network-check-target-b4f6z" (UID: "b566e32e-9149-4af2-a4dd-4dba40f00efa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:25.680413 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.680376 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:09:23 +0000 UTC" deadline="2028-01-31 13:05:12.276715768 +0000 UTC" Apr 23 08:14:25.680413 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.680414 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15556h50m46.596305125s" Apr 23 08:14:25.785149 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.785044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" event={"ID":"b2d80946-cce7-4f66-bd36-9b3413670c78","Type":"ContainerStarted","Data":"05f73c9ef599cb11b2a8f8e104b0b04a83d699de7899f8357d0c5ec2b365af10"} Apr 23 08:14:25.787814 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.787784 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" event={"ID":"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31","Type":"ContainerStarted","Data":"bc00845c8ece3666101054ad90d2af7c16969fba01641303d11d414002b7ce35"} Apr 23 08:14:25.792636 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.792608 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dhsc2" event={"ID":"b040571c-3a7e-407f-8b6a-b70862f5b8c0","Type":"ContainerStarted","Data":"92bf5d4362b7412ca2c58495d7ceae84d0d27cfe92c74e25232ea36e8f9716b8"} Apr 23 08:14:25.795141 ip-10-0-129-53 kubenswrapper[2575]: I0423 
08:14:25.795077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vj7h5" event={"ID":"e3ee3b59-dda6-4efc-944c-7d052a7a6c46","Type":"ContainerStarted","Data":"7f3ed11389ff559560127349c1f4ffcc760dbdeb87ceef560ed57c2ec0fc82f0"} Apr 23 08:14:25.801026 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.800974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" event={"ID":"014c9a15-8c18-48c5-afe1-53f597a02932","Type":"ContainerStarted","Data":"9b4698d6ee7661e95df1cd3b781b937a637e9ae96d4115c0b8961a3bc7ae8426"} Apr 23 08:14:25.803253 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.803227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal" event={"ID":"9b0e838abf38c7589ab4ddfd6dec535f","Type":"ContainerStarted","Data":"ba6405b39531572c94aa85d30ec282e1dbd75f36656ba4cc616b37fc2c70ae3b"} Apr 23 08:14:25.808265 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.808211 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx29n" event={"ID":"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82","Type":"ContainerStarted","Data":"a3873ca8a31eac850aee577b818c8f77f9b9f9314d778a9a189e4f0170c08333"} Apr 23 08:14:25.812372 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.812348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sx7f6" event={"ID":"f798ddd1-abe6-46fe-8f87-51eb8f211cba","Type":"ContainerStarted","Data":"afc96e5d695ea78409026d29f1f3e3f2499611483587aec04beb8bb969e12577"} Apr 23 08:14:25.819039 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.818993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rtkxh" event={"ID":"908435e9-5714-4471-be36-df61c28816d3","Type":"ContainerStarted","Data":"f54082c2ae4b65f626b20cb9ecad5e52f01e024a43e94faa2fb9fec9bc28a90d"} Apr 23 
08:14:25.824793 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:25.824762 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ht997" event={"ID":"ccf4efd3-2881-48aa-80c7-44d62f243db8","Type":"ContainerStarted","Data":"eefb2f11fea9267e2c82b18da54fa26bf455a804139e78c3325d8632541d03f5"}
Apr 23 08:14:26.269793 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:26.269516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:26.269793 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:26.269699 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:26.269793 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:26.269773 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs podName:75af54d6-d4ae-4e8e-bf63-80cc7a54fe63 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:28.269752047 +0000 UTC m=+6.068142499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs") pod "network-metrics-daemon-gtsb8" (UID: "75af54d6-d4ae-4e8e-bf63-80cc7a54fe63") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:26.471128 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:26.471091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8rp\" (UniqueName: \"kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp\") pod \"network-check-target-b4f6z\" (UID: \"b566e32e-9149-4af2-a4dd-4dba40f00efa\") " pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:26.471340 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:26.471317 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:14:26.471426 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:26.471343 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:14:26.471426 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:26.471355 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dw8rp for pod openshift-network-diagnostics/network-check-target-b4f6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:26.471426 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:26.471418 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp podName:b566e32e-9149-4af2-a4dd-4dba40f00efa nodeName:}" failed. No retries permitted until 2026-04-23 08:14:28.471398973 +0000 UTC m=+6.269789424 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dw8rp" (UniqueName: "kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp") pod "network-check-target-b4f6z" (UID: "b566e32e-9149-4af2-a4dd-4dba40f00efa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:26.776157 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:26.775438 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:26.776157 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:26.775572 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:26.776157 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:26.775661 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:26.776157 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:26.775726 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa"
Apr 23 08:14:26.845623 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:26.845586 2575 generic.go:358] "Generic (PLEG): container finished" podID="63ca270e6c7ab8749d592459a749af4f" containerID="1c4e9119ae4361dfd98774af07abd81568838ab6180acb6f554bb8f616ba3a3e" exitCode=0
Apr 23 08:14:26.846251 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:26.846216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" event={"ID":"63ca270e6c7ab8749d592459a749af4f","Type":"ContainerDied","Data":"1c4e9119ae4361dfd98774af07abd81568838ab6180acb6f554bb8f616ba3a3e"}
Apr 23 08:14:26.860547 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:26.858908 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-53.ec2.internal" podStartSLOduration=2.858892973 podStartE2EDuration="2.858892973s" podCreationTimestamp="2026-04-23 08:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:14:25.816509045 +0000 UTC m=+3.614899514" watchObservedRunningTime="2026-04-23 08:14:26.858892973 +0000 UTC m=+4.657283453"
Apr 23 08:14:27.852954 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:27.852311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" event={"ID":"63ca270e6c7ab8749d592459a749af4f","Type":"ContainerStarted","Data":"eaa33ca4415f1303a83ecd3bd146165aa6ece49a888f2c3f9a39fe098b14d4d1"}
Apr 23 08:14:27.868285 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:27.868226 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-53.ec2.internal" podStartSLOduration=3.86820704 podStartE2EDuration="3.86820704s" podCreationTimestamp="2026-04-23 08:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:14:27.867762894 +0000 UTC m=+5.666153364" watchObservedRunningTime="2026-04-23 08:14:27.86820704 +0000 UTC m=+5.666597510"
Apr 23 08:14:28.290559 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:28.290467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:28.290723 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:28.290625 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:28.290723 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:28.290692 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs podName:75af54d6-d4ae-4e8e-bf63-80cc7a54fe63 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:32.290671975 +0000 UTC m=+10.089062434 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs") pod "network-metrics-daemon-gtsb8" (UID: "75af54d6-d4ae-4e8e-bf63-80cc7a54fe63") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:28.491572 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:28.491529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8rp\" (UniqueName: \"kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp\") pod \"network-check-target-b4f6z\" (UID: \"b566e32e-9149-4af2-a4dd-4dba40f00efa\") " pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:28.491777 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:28.491747 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:14:28.491777 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:28.491779 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:14:28.491919 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:28.491792 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dw8rp for pod openshift-network-diagnostics/network-check-target-b4f6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:28.491919 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:28.491850 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp podName:b566e32e-9149-4af2-a4dd-4dba40f00efa nodeName:}" failed. No retries permitted until 2026-04-23 08:14:32.491832029 +0000 UTC m=+10.290222487 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dw8rp" (UniqueName: "kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp") pod "network-check-target-b4f6z" (UID: "b566e32e-9149-4af2-a4dd-4dba40f00efa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:28.773515 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:28.773406 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:28.773683 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:28.773580 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa"
Apr 23 08:14:28.774007 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:28.773989 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:28.774146 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:28.774100 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
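The `durationBeforeRetry` values in the mount failures above climb from 2s to 4s (and later in this log to 8s and 16s, with the `original-pull-secret` volume starting at 500ms), i.e. the kubelet's per-volume retry delay doubles on each failed attempt. A minimal sketch of that doubling schedule, with the initial delay and cap taken as assumptions modeled on the intervals visible in this log rather than read from kubelet source:

```python
# Hedged sketch of the doubling retry delay behind the "durationBeforeRetry"
# fields in the kubelet log above. INITIAL_DELAY and MAX_DELAY are assumptions
# inferred from the observed intervals (500ms, 1s, 2s, 4s, 8s, 16s, ...).
from datetime import timedelta

INITIAL_DELAY = timedelta(milliseconds=500)  # smallest delay seen in this log
MAX_DELAY = timedelta(minutes=2, seconds=2)  # assumed cap on the backoff


def backoff_schedule(attempts: int) -> list[timedelta]:
    """Return the delay before each retry: doubling, clamped at MAX_DELAY."""
    delays, delay = [], INITIAL_DELAY
    for _ in range(attempts):
        delays.append(delay)
        delay = min(delay * 2, MAX_DELAY)
    return delays


# The 2s/4s/8s/16s intervals in the log match attempts 3-6 of this schedule.
print([d.total_seconds() for d in backoff_schedule(6)])
```

This is why a pod stuck on an unregistered secret or configMap appears to "go quiet" for progressively longer stretches: the kubelet is not retrying at a fixed rate but backing off per volume.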
pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:30.773396 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:30.773285 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:30.773396 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:30.773326 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:30.773886 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:30.773430 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa"
Apr 23 08:14:30.773886 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:30.773631 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:32.320626 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:32.320548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:32.321074 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:32.320707 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:32.321074 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:32.320781 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs podName:75af54d6-d4ae-4e8e-bf63-80cc7a54fe63 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:40.320760214 +0000 UTC m=+18.119150673 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs") pod "network-metrics-daemon-gtsb8" (UID: "75af54d6-d4ae-4e8e-bf63-80cc7a54fe63") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:32.521968 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:32.521927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8rp\" (UniqueName: \"kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp\") pod \"network-check-target-b4f6z\" (UID: \"b566e32e-9149-4af2-a4dd-4dba40f00efa\") " pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:32.522136 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:32.522117 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:14:32.522136 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:32.522134 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:14:32.522237 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:32.522143 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dw8rp for pod openshift-network-diagnostics/network-check-target-b4f6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:32.522237 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:32.522198 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp podName:b566e32e-9149-4af2-a4dd-4dba40f00efa nodeName:}" failed. No retries permitted until 2026-04-23 08:14:40.522179746 +0000 UTC m=+18.320570193 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dw8rp" (UniqueName: "kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp") pod "network-check-target-b4f6z" (UID: "b566e32e-9149-4af2-a4dd-4dba40f00efa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:32.774119 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:32.774008 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:32.774306 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:32.774115 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa"
Apr 23 08:14:32.774521 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:32.774498 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:32.774632 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:32.774608 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:34.773197 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.772939 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:34.773777 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:34.773243 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa"
Apr 23 08:14:34.773777 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.773013 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:34.773777 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:34.773368 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:34.867647 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.867260 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx29n" event={"ID":"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82","Type":"ContainerStarted","Data":"14c7c1672f2f8c6c59f0b54dc3f5ba3946a3c5934eb9248833eddaab523465ff"}
Apr 23 08:14:34.869052 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.869016 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sx7f6" event={"ID":"f798ddd1-abe6-46fe-8f87-51eb8f211cba","Type":"ContainerStarted","Data":"04e1d3cd249ff45146aed0de70b91139e0b637de6bfe4f31e0a3cfc6ff370d08"}
Apr 23 08:14:34.870698 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.870668 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ht997" event={"ID":"ccf4efd3-2881-48aa-80c7-44d62f243db8","Type":"ContainerStarted","Data":"766c06e44d03ffdb7f6dfedb833654c527238c985a201202d404f93ef82641a3"}
Apr 23 08:14:34.872310 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.872248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" event={"ID":"b2d80946-cce7-4f66-bd36-9b3413670c78","Type":"ContainerStarted","Data":"6b98898a69a23923906fedff98f7909032748847d89e50fdd2c058b6b61a1e21"}
Apr 23 08:14:34.874005 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.873980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vj7h5" event={"ID":"e3ee3b59-dda6-4efc-944c-7d052a7a6c46","Type":"ContainerStarted","Data":"4734a9fd332cdd56cd1bb761d048bb8b51e92585cae5117f0481584aa8e33025"}
Apr 23 08:14:34.875903 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.875880 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" event={"ID":"014c9a15-8c18-48c5-afe1-53f597a02932","Type":"ContainerStarted","Data":"c4af6df5a648bbaa7936c9aaaba1d9ad485bae130ebdaa06b7c87c14f98ff3aa"}
Apr 23 08:14:34.906261 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.906215 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ht997" podStartSLOduration=4.373068521 podStartE2EDuration="12.906200717s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:14:25.430017361 +0000 UTC m=+3.228407805" lastFinishedPulling="2026-04-23 08:14:33.963149552 +0000 UTC m=+11.761540001" observedRunningTime="2026-04-23 08:14:34.905634641 +0000 UTC m=+12.704025110" watchObservedRunningTime="2026-04-23 08:14:34.906200717 +0000 UTC m=+12.704591185"
Apr 23 08:14:34.921321 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.921266 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vj7h5" podStartSLOduration=4.402284918 podStartE2EDuration="12.921253788s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:14:25.455569733 +0000 UTC m=+3.253960178" lastFinishedPulling="2026-04-23 08:14:33.974538596 +0000 UTC m=+11.772929048" observedRunningTime="2026-04-23 08:14:34.920870166 +0000 UTC m=+12.719260654" watchObservedRunningTime="2026-04-23 08:14:34.921253788 +0000 UTC m=+12.719644254"
Apr 23 08:14:34.935528 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.935479 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sx7f6" podStartSLOduration=4.421594266 podStartE2EDuration="12.935465101s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:14:25.459720393 +0000 UTC m=+3.258110844" lastFinishedPulling="2026-04-23 08:14:33.973591222 +0000 UTC m=+11.771981679" observedRunningTime="2026-04-23 08:14:34.935071178 +0000 UTC m=+12.733461649" watchObservedRunningTime="2026-04-23 08:14:34.935465101 +0000 UTC m=+12.733855570"
Apr 23 08:14:34.956522 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:34.956482 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vrk6z" podStartSLOduration=4.392356264 podStartE2EDuration="12.956468723s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:14:25.428106533 +0000 UTC m=+3.226496987" lastFinishedPulling="2026-04-23 08:14:33.992218987 +0000 UTC m=+11.790609446" observedRunningTime="2026-04-23 08:14:34.95642815 +0000 UTC m=+12.754818640" watchObservedRunningTime="2026-04-23 08:14:34.956468723 +0000 UTC m=+12.754859191"
Apr 23 08:14:35.879462 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:35.879191 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rtkxh" event={"ID":"908435e9-5714-4471-be36-df61c28816d3","Type":"ContainerStarted","Data":"082a696331ec9782bdacade63193f84238ad7ad83c50ced027c9545c52dad8db"}
Apr 23 08:14:35.897850 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:35.897792 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rtkxh" podStartSLOduration=5.376247257 podStartE2EDuration="13.8977745s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:14:25.451202431 +0000 UTC m=+3.249592879" lastFinishedPulling="2026-04-23 08:14:33.972729663 +0000 UTC m=+11.771120122" observedRunningTime="2026-04-23 08:14:35.897198451 +0000 UTC m=+13.695588918" watchObservedRunningTime="2026-04-23 08:14:35.8977745 +0000 UTC m=+13.696164950"
Apr 23 08:14:36.772939 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:36.772905 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:36.773100 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:36.772905 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:36.773100 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:36.773038 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa"
Apr 23 08:14:36.773202 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:36.773113 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:37.070242 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:37.070168 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vj7h5"
Apr 23 08:14:37.071007 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:37.070823 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vj7h5"
Apr 23 08:14:38.773067 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:38.773032 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:38.773620 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:38.773165 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:38.773620 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:38.773230 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:38.773620 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:38.773341 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa"
Apr 23 08:14:38.875639 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:38.875605 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9ffdf"]
Apr 23 08:14:38.881836 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:38.881810 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:38.881974 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:38.881892 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9ffdf" podUID="a6e9434c-f939-423a-8859-5a047d434c0c"
Apr 23 08:14:38.980397 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:38.980358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:38.980557 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:38.980422 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a6e9434c-f939-423a-8859-5a047d434c0c-kubelet-config\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:38.980557 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:38.980529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a6e9434c-f939-423a-8859-5a047d434c0c-dbus\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:39.081732 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:39.081649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a6e9434c-f939-423a-8859-5a047d434c0c-dbus\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:39.081732 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:39.081726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:39.081942 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:39.081761 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a6e9434c-f939-423a-8859-5a047d434c0c-kubelet-config\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:39.081942 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:39.081829 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a6e9434c-f939-423a-8859-5a047d434c0c-kubelet-config\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:39.081942 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:39.081872 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:14:39.081942 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:39.081938 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret podName:a6e9434c-f939-423a-8859-5a047d434c0c nodeName:}" failed. No retries permitted until 2026-04-23 08:14:39.581923564 +0000 UTC m=+17.380314009 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret") pod "global-pull-secret-syncer-9ffdf" (UID: "a6e9434c-f939-423a-8859-5a047d434c0c") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:14:39.082094 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:39.081939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a6e9434c-f939-423a-8859-5a047d434c0c-dbus\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:39.586598 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:39.586389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:39.586788 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:39.586530 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:14:39.586788 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:39.586748 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret podName:a6e9434c-f939-423a-8859-5a047d434c0c nodeName:}" failed. No retries permitted until 2026-04-23 08:14:40.586727136 +0000 UTC m=+18.385117603 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret") pod "global-pull-secret-syncer-9ffdf" (UID: "a6e9434c-f939-423a-8859-5a047d434c0c") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:14:39.886842 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:39.886812 2575 generic.go:358] "Generic (PLEG): container finished" podID="00bd34ed-f0ff-40f8-bd12-a9c364e0fe82" containerID="14c7c1672f2f8c6c59f0b54dc3f5ba3946a3c5934eb9248833eddaab523465ff" exitCode=0
Apr 23 08:14:39.887232 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:39.886873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx29n" event={"ID":"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82","Type":"ContainerDied","Data":"14c7c1672f2f8c6c59f0b54dc3f5ba3946a3c5934eb9248833eddaab523465ff"}
Apr 23 08:14:40.393420 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:40.393382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:40.393590 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.393545 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:40.393635 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.393618 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs podName:75af54d6-d4ae-4e8e-bf63-80cc7a54fe63 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:56.393601963 +0000 UTC m=+34.191992411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs") pod "network-metrics-daemon-gtsb8" (UID: "75af54d6-d4ae-4e8e-bf63-80cc7a54fe63") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:40.594979 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:40.594946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:40.595253 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:40.594998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8rp\" (UniqueName: \"kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp\") pod \"network-check-target-b4f6z\" (UID: \"b566e32e-9149-4af2-a4dd-4dba40f00efa\") " pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:40.595253 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.595103 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:14:40.595253 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.595114 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:14:40.595253 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.595132 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:14:40.595253 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.595144 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dw8rp for pod openshift-network-diagnostics/network-check-target-b4f6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:40.595253 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.595172 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret podName:a6e9434c-f939-423a-8859-5a047d434c0c nodeName:}" failed. No retries permitted until 2026-04-23 08:14:42.595154065 +0000 UTC m=+20.393544524 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret") pod "global-pull-secret-syncer-9ffdf" (UID: "a6e9434c-f939-423a-8859-5a047d434c0c") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:14:40.595253 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.595192 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp podName:b566e32e-9149-4af2-a4dd-4dba40f00efa nodeName:}" failed. No retries permitted until 2026-04-23 08:14:56.595183003 +0000 UTC m=+34.393573458 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dw8rp" (UniqueName: "kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp") pod "network-check-target-b4f6z" (UID: "b566e32e-9149-4af2-a4dd-4dba40f00efa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:40.775440 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:40.775373 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:40.775440 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:40.775395 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z" Apr 23 08:14:40.775440 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:40.775373 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:14:40.775656 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.775491 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa" Apr 23 08:14:40.775656 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.775548 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9ffdf" podUID="a6e9434c-f939-423a-8859-5a047d434c0c" Apr 23 08:14:40.775656 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:40.775635 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63" Apr 23 08:14:42.606916 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:42.606884 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:14:42.607269 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:42.607018 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:42.607269 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:42.607077 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret podName:a6e9434c-f939-423a-8859-5a047d434c0c nodeName:}" failed. No retries permitted until 2026-04-23 08:14:46.607063696 +0000 UTC m=+24.405454147 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret") pod "global-pull-secret-syncer-9ffdf" (UID: "a6e9434c-f939-423a-8859-5a047d434c0c") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:42.773551 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:42.773524 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z" Apr 23 08:14:42.773679 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:42.773598 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa" Apr 23 08:14:42.773728 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:42.773680 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:14:42.773796 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:42.773777 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9ffdf" podUID="a6e9434c-f939-423a-8859-5a047d434c0c" Apr 23 08:14:42.773829 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:42.773821 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:42.773887 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:42.773875 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63" Apr 23 08:14:43.557415 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.557152 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 08:14:43.716392 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.716279 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:14:43.557397231Z","UUID":"50d0601b-9e5c-4ede-b79a-0951e0a12b7b","Handler":null,"Name":"","Endpoint":""} Apr 23 08:14:43.718379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.718342 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 08:14:43.718379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.718369 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 08:14:43.894593 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.894517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dhsc2" event={"ID":"b040571c-3a7e-407f-8b6a-b70862f5b8c0","Type":"ContainerStarted","Data":"b6df82233654772af634bb74b6f122a4e93f8b5639edffbf667b01c9619ac3bb"} Apr 23 08:14:43.896512 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.896475 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" event={"ID":"014c9a15-8c18-48c5-afe1-53f597a02932","Type":"ContainerStarted","Data":"f2d6ea3efb1a2ca4ec4511cf239a0f92ac9f055fd6f88062adc3edbbc86e52f3"} Apr 23 08:14:43.899330 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.899308 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" event={"ID":"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31","Type":"ContainerStarted","Data":"0b1575a1b0af433c28172b6fe89b871f4db48deb0fc336d99657118ea64fa21a"} Apr 23 08:14:43.899330 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.899332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" event={"ID":"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31","Type":"ContainerStarted","Data":"e816b8084902dc6c64cf1172124a82b51f0f48a0c2521838ca6acf1ec58d32de"} Apr 23 08:14:43.899523 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.899346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" event={"ID":"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31","Type":"ContainerStarted","Data":"e2494016e0502443c3160ef56595e1a74aa302d555b6b3b762763c0ac981db41"} Apr 23 08:14:43.899523 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.899358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" event={"ID":"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31","Type":"ContainerStarted","Data":"e703abfeeea2498a70ae58f87f1bacff5e14fcaa85a740c80abb37b807252dd0"} Apr 23 08:14:43.899523 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.899366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" event={"ID":"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31","Type":"ContainerStarted","Data":"342459e806a269c252ea8d409c716d5e24b3190ed07b693d1bd354dd773cd1b7"} Apr 23 08:14:43.899523 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.899373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" event={"ID":"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31","Type":"ContainerStarted","Data":"e8e7513bab2169f36f1ce5d66539120fa482046841d10230deaab7d0938afb31"} Apr 23 08:14:43.912182 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:43.912145 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dhsc2" podStartSLOduration=4.258565574 podStartE2EDuration="21.912134166s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:14:25.459935394 +0000 UTC m=+3.258325853" lastFinishedPulling="2026-04-23 08:14:43.113503986 +0000 UTC m=+20.911894445" observedRunningTime="2026-04-23 08:14:43.911864382 +0000 UTC m=+21.710254850" watchObservedRunningTime="2026-04-23 08:14:43.912134166 +0000 UTC m=+21.710524632" Apr 23 08:14:44.773579 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:44.773549 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:14:44.774039 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:44.773549 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:44.774039 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:44.773681 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9ffdf" podUID="a6e9434c-f939-423a-8859-5a047d434c0c" Apr 23 08:14:44.774039 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:44.773807 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63" Apr 23 08:14:44.774039 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:44.773554 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z" Apr 23 08:14:44.774039 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:44.773910 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa" Apr 23 08:14:44.903285 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:44.903249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" event={"ID":"014c9a15-8c18-48c5-afe1-53f597a02932","Type":"ContainerStarted","Data":"0ce7d237a7f5ffac3abbcd2f8390b87a7eb4a6658d59645aafd235114660bd4d"} Apr 23 08:14:44.919876 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:44.919832 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2w578" podStartSLOduration=3.681068726 podStartE2EDuration="22.919819068s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:14:25.45440713 +0000 UTC m=+3.252797589" lastFinishedPulling="2026-04-23 08:14:44.69315748 +0000 UTC m=+22.491547931" observedRunningTime="2026-04-23 08:14:44.919728355 +0000 UTC m=+22.718118822" watchObservedRunningTime="2026-04-23 08:14:44.919819068 +0000 UTC m=+22.718209534" Apr 23 08:14:46.641714 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:46.641679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:14:46.642091 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:46.641818 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:46.642091 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:46.641880 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret podName:a6e9434c-f939-423a-8859-5a047d434c0c nodeName:}" failed. No retries permitted until 2026-04-23 08:14:54.641864536 +0000 UTC m=+32.440254986 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret") pod "global-pull-secret-syncer-9ffdf" (UID: "a6e9434c-f939-423a-8859-5a047d434c0c") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:46.773602 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:46.773543 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z" Apr 23 08:14:46.773730 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:46.773641 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa" Apr 23 08:14:46.773730 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:46.773543 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:14:46.773730 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:46.773683 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:46.773872 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:46.773733 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9ffdf" podUID="a6e9434c-f939-423a-8859-5a047d434c0c" Apr 23 08:14:46.773926 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:46.773853 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63" Apr 23 08:14:46.910025 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:46.909988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" event={"ID":"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31","Type":"ContainerStarted","Data":"8ddab4e110c36e841ebbf2c3f58ccc3c7c09230a349b66c0b7af2c3367794872"} Apr 23 08:14:47.912955 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:47.912918 2575 generic.go:358] "Generic (PLEG): container finished" podID="00bd34ed-f0ff-40f8-bd12-a9c364e0fe82" containerID="35d7c67f025643a9d60254ff7e8aed56e7e3cdb3d8c33775c58cece9daa55fb4" exitCode=0 Apr 23 08:14:47.913406 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:47.912977 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx29n" event={"ID":"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82","Type":"ContainerDied","Data":"35d7c67f025643a9d60254ff7e8aed56e7e3cdb3d8c33775c58cece9daa55fb4"} Apr 23 08:14:48.703946 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.703924 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vj7h5" Apr 23 08:14:48.704066 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.704048 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 08:14:48.704816 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.704692 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vj7h5" Apr 23 08:14:48.773100 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.773036 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z" Apr 23 08:14:48.773100 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.773054 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:48.773332 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:48.773142 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa" Apr 23 08:14:48.773332 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:48.773207 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63" Apr 23 08:14:48.773332 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.773237 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:14:48.773332 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:48.773326 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9ffdf" podUID="a6e9434c-f939-423a-8859-5a047d434c0c" Apr 23 08:14:48.917588 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.917546 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" event={"ID":"7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31","Type":"ContainerStarted","Data":"689da7393be6ad49db071c0377d94defb084b0e0d9c296a37533cfe0bb9cafd4"} Apr 23 08:14:48.917997 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.917832 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:48.917997 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.917856 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:48.932549 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.932520 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:48.971700 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:48.971651 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" podStartSLOduration=9.33051898 podStartE2EDuration="26.971637067s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:14:25.428096977 +0000 UTC m=+3.226487421" lastFinishedPulling="2026-04-23 08:14:43.069215062 +0000 UTC m=+20.867605508" observedRunningTime="2026-04-23 08:14:48.944371698 +0000 UTC m=+26.742762165" watchObservedRunningTime="2026-04-23 08:14:48.971637067 +0000 UTC m=+26.770027534" Apr 23 08:14:49.921598 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:49.921563 2575 generic.go:358] "Generic (PLEG): container finished" podID="00bd34ed-f0ff-40f8-bd12-a9c364e0fe82" containerID="048c95ad2e77348fb629633f595457bffb55b9a48d0fb1330f796519f8ed461d" exitCode=0 Apr 23 08:14:49.922049 
ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:49.921646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx29n" event={"ID":"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82","Type":"ContainerDied","Data":"048c95ad2e77348fb629633f595457bffb55b9a48d0fb1330f796519f8ed461d"} Apr 23 08:14:49.922117 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:49.922093 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:49.936784 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:49.936763 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:14:50.720482 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:50.720442 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9ffdf"] Apr 23 08:14:50.720697 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:50.720580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:14:50.720748 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:50.720687 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9ffdf" podUID="a6e9434c-f939-423a-8859-5a047d434c0c" Apr 23 08:14:50.721200 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:50.721178 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b4f6z"] Apr 23 08:14:50.721318 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:50.721274 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z" Apr 23 08:14:50.721408 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:50.721384 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa" Apr 23 08:14:50.721818 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:50.721789 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gtsb8"] Apr 23 08:14:50.721891 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:50.721878 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:50.721998 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:50.721977 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:51.927939 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:51.927769 2575 generic.go:358] "Generic (PLEG): container finished" podID="00bd34ed-f0ff-40f8-bd12-a9c364e0fe82" containerID="e601ff058c0bb5675b4d836e07b4a27503553c2545a8f18f5d49c58269d6adc4" exitCode=0
Apr 23 08:14:51.928337 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:51.927840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx29n" event={"ID":"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82","Type":"ContainerDied","Data":"e601ff058c0bb5675b4d836e07b4a27503553c2545a8f18f5d49c58269d6adc4"}
Apr 23 08:14:52.774349 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:52.774320 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:52.774494 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:52.774432 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:52.774494 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:52.774463 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9ffdf" podUID="a6e9434c-f939-423a-8859-5a047d434c0c"
Apr 23 08:14:52.774611 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:52.774522 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:52.774611 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:52.774530 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:52.774688 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:52.774618 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa"
Apr 23 08:14:54.709082 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:54.709044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:54.709652 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:54.709198 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:14:54.709652 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:54.709270 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret podName:a6e9434c-f939-423a-8859-5a047d434c0c nodeName:}" failed. No retries permitted until 2026-04-23 08:15:10.70925517 +0000 UTC m=+48.507645619 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret") pod "global-pull-secret-syncer-9ffdf" (UID: "a6e9434c-f939-423a-8859-5a047d434c0c") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:14:54.773620 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:54.773587 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:54.773784 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:54.773588 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:54.773784 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:54.773736 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:54.773784 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:54.773768 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:54.773954 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:54.773848 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa"
Apr 23 08:14:54.773954 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:54.773928 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9ffdf" podUID="a6e9434c-f939-423a-8859-5a047d434c0c"
Apr 23 08:14:56.421821 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:56.421776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:56.422436 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:56.421944 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:56.422436 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:56.422019 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs podName:75af54d6-d4ae-4e8e-bf63-80cc7a54fe63 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:28.421996493 +0000 UTC m=+66.220386959 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs") pod "network-metrics-daemon-gtsb8" (UID: "75af54d6-d4ae-4e8e-bf63-80cc7a54fe63") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:56.623720 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:56.623678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8rp\" (UniqueName: \"kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp\") pod \"network-check-target-b4f6z\" (UID: \"b566e32e-9149-4af2-a4dd-4dba40f00efa\") " pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:56.623855 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:56.623841 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:14:56.623912 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:56.623863 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:14:56.623912 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:56.623876 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dw8rp for pod openshift-network-diagnostics/network-check-target-b4f6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:56.624000 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:56.623944 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp podName:b566e32e-9149-4af2-a4dd-4dba40f00efa nodeName:}" failed. No retries permitted until 2026-04-23 08:15:28.62392259 +0000 UTC m=+66.422313040 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dw8rp" (UniqueName: "kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp") pod "network-check-target-b4f6z" (UID: "b566e32e-9149-4af2-a4dd-4dba40f00efa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:56.773102 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:56.773015 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf"
Apr 23 08:14:56.773102 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:56.773056 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:14:56.773102 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:56.773049 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:14:56.773382 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:56.773182 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9ffdf" podUID="a6e9434c-f939-423a-8859-5a047d434c0c"
Apr 23 08:14:56.773382 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:56.773266 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63"
Apr 23 08:14:56.773382 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:56.773349 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4f6z" podUID="b566e32e-9149-4af2-a4dd-4dba40f00efa"
Apr 23 08:14:57.055842 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.055768 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeReady"
Apr 23 08:14:57.055979 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.055908 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 08:14:57.090522 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.090487 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55b55655f8-zwssm"]
Apr 23 08:14:57.133627 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.133595 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5r722"]
Apr 23 08:14:57.133803 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.133774 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.143553 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.143532 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 08:14:57.143814 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.143797 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fw7pg\""
Apr 23 08:14:57.143940 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.143798 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 08:14:57.144382 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.144358 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 08:14:57.147048 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.147018 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z4nr6"]
Apr 23 08:14:57.147199 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.147181 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.150731 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.150710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 08:14:57.151223 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.151199 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f2cmm\""
Apr 23 08:14:57.151518 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.151497 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 08:14:57.163573 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.163525 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 08:14:57.170350 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.170326 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55b55655f8-zwssm"]
Apr 23 08:14:57.170350 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.170352 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5r722"]
Apr 23 08:14:57.170461 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.170364 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z4nr6"]
Apr 23 08:14:57.170505 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.170468 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z4nr6"
Apr 23 08:14:57.172824 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.172803 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 08:14:57.172931 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.172807 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkwh7\""
Apr 23 08:14:57.173021 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.172997 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 08:14:57.173130 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.173108 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 08:14:57.227563 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.227531 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbg67\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-kube-api-access-dbg67\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.227563 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.227576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-bound-sa-token\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.227789 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.227704 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-image-registry-private-configuration\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.227789 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.227740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-trusted-ca\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.227789 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.227775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-certificates\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.227940 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.227803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.227940 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.227829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-installation-pull-secrets\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.227940 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.227861 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf15dd93-dbaa-4c02-82cd-5156525a67a9-ca-trust-extracted\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.329103 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-image-registry-private-configuration\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.329103 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9phfd\" (UniqueName: \"kubernetes.io/projected/8a62b026-adde-4674-a052-cc9aa72e0a2a-kube-api-access-9phfd\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.329103 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-trusted-ca\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.329103 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329101 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6"
Apr 23 08:14:57.329429 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329119 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94hvg\" (UniqueName: \"kubernetes.io/projected/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-kube-api-access-94hvg\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6"
Apr 23 08:14:57.329429 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.329429 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-certificates\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.329429 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.329429 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.329382 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:14:57.329429 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-installation-pull-secrets\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.329677 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf15dd93-dbaa-4c02-82cd-5156525a67a9-ca-trust-extracted\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.329677 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.329399 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55b55655f8-zwssm: secret "image-registry-tls" not found
Apr 23 08:14:57.329677 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbg67\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-kube-api-access-dbg67\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.329677 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329522 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a62b026-adde-4674-a052-cc9aa72e0a2a-config-volume\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.329677 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329548 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a62b026-adde-4674-a052-cc9aa72e0a2a-tmp-dir\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.329677 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.329567 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls podName:cf15dd93-dbaa-4c02-82cd-5156525a67a9 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:57.829541644 +0000 UTC m=+35.627932099 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls") pod "image-registry-55b55655f8-zwssm" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9") : secret "image-registry-tls" not found
Apr 23 08:14:57.329677 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-bound-sa-token\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.330002 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-certificates\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.330002 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.329975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf15dd93-dbaa-4c02-82cd-5156525a67a9-ca-trust-extracted\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.330189 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.330165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-trusted-ca\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.333488 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.333466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-installation-pull-secrets\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.333625 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.333548 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-image-registry-private-configuration\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.340788 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.340766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-bound-sa-token\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.340929 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.340910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbg67\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-kube-api-access-dbg67\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.430947 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.430906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a62b026-adde-4674-a052-cc9aa72e0a2a-config-volume\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.431548 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.430954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a62b026-adde-4674-a052-cc9aa72e0a2a-tmp-dir\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.431548 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.431032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9phfd\" (UniqueName: \"kubernetes.io/projected/8a62b026-adde-4674-a052-cc9aa72e0a2a-kube-api-access-9phfd\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.431548 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.431060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6"
Apr 23 08:14:57.431548 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.431086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94hvg\" (UniqueName: \"kubernetes.io/projected/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-kube-api-access-94hvg\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6"
Apr 23 08:14:57.431548 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.431114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.431548 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.431212 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:14:57.431548 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.431279 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls podName:8a62b026-adde-4674-a052-cc9aa72e0a2a nodeName:}" failed. No retries permitted until 2026-04-23 08:14:57.931260024 +0000 UTC m=+35.729650483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls") pod "dns-default-5r722" (UID: "8a62b026-adde-4674-a052-cc9aa72e0a2a") : secret "dns-default-metrics-tls" not found
Apr 23 08:14:57.431548 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.431362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a62b026-adde-4674-a052-cc9aa72e0a2a-tmp-dir\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.431548 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.431442 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:14:57.431548 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.431478 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert podName:5c18ff90-80b0-4fd9-a24d-ee6d39d729b7 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:57.931466822 +0000 UTC m=+35.729857280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert") pod "ingress-canary-z4nr6" (UID: "5c18ff90-80b0-4fd9-a24d-ee6d39d729b7") : secret "canary-serving-cert" not found
Apr 23 08:14:57.431913 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.431590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a62b026-adde-4674-a052-cc9aa72e0a2a-config-volume\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.440364 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.440340 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9phfd\" (UniqueName: \"kubernetes.io/projected/8a62b026-adde-4674-a052-cc9aa72e0a2a-kube-api-access-9phfd\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.440474 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.440347 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94hvg\" (UniqueName: \"kubernetes.io/projected/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-kube-api-access-94hvg\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6"
Apr 23 08:14:57.834357 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.834323 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:14:57.834512 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.834423 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:14:57.834512 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.834435 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55b55655f8-zwssm: secret "image-registry-tls" not found
Apr 23 08:14:57.834512 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.834482 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls podName:cf15dd93-dbaa-4c02-82cd-5156525a67a9 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:58.834469197 +0000 UTC m=+36.632859642 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls") pod "image-registry-55b55655f8-zwssm" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9") : secret "image-registry-tls" not found
Apr 23 08:14:57.935109 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.935075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6"
Apr 23 08:14:57.935109 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:57.935111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:14:57.935289 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.935215 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:14:57.935289
ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.935217 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:14:57.935289 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.935275 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert podName:5c18ff90-80b0-4fd9-a24d-ee6d39d729b7 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:58.935258738 +0000 UTC m=+36.733649183 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert") pod "ingress-canary-z4nr6" (UID: "5c18ff90-80b0-4fd9-a24d-ee6d39d729b7") : secret "canary-serving-cert" not found Apr 23 08:14:57.935289 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:57.935288 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls podName:8a62b026-adde-4674-a052-cc9aa72e0a2a nodeName:}" failed. No retries permitted until 2026-04-23 08:14:58.935282242 +0000 UTC m=+36.733672687 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls") pod "dns-default-5r722" (UID: "8a62b026-adde-4674-a052-cc9aa72e0a2a") : secret "dns-default-metrics-tls" not found Apr 23 08:14:58.773199 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.773169 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:14:58.773599 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.773213 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:14:58.773599 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.773216 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z" Apr 23 08:14:58.776064 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.776045 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:14:58.776186 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.776076 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:14:58.776186 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.776098 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 08:14:58.777282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.777265 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:14:58.777364 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.777350 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p5rcb\"" Apr 23 08:14:58.777401 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.777362 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vjj4z\"" Apr 23 08:14:58.840711 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.840688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " 
pod="openshift-image-registry/image-registry-55b55655f8-zwssm" Apr 23 08:14:58.840834 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:58.840817 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:14:58.840875 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:58.840836 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55b55655f8-zwssm: secret "image-registry-tls" not found Apr 23 08:14:58.840912 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:58.840886 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls podName:cf15dd93-dbaa-4c02-82cd-5156525a67a9 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:00.840869646 +0000 UTC m=+38.639260091 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls") pod "image-registry-55b55655f8-zwssm" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9") : secret "image-registry-tls" not found Apr 23 08:14:58.941081 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.941051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722" Apr 23 08:14:58.941256 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:58.941183 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:14:58.941256 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.941198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6" Apr 23 08:14:58.941256 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:58.941237 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls podName:8a62b026-adde-4674-a052-cc9aa72e0a2a nodeName:}" failed. No retries permitted until 2026-04-23 08:15:00.941223638 +0000 UTC m=+38.739614088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls") pod "dns-default-5r722" (UID: "8a62b026-adde-4674-a052-cc9aa72e0a2a") : secret "dns-default-metrics-tls" not found Apr 23 08:14:58.941457 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:58.941305 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:14:58.941457 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:14:58.941369 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert podName:5c18ff90-80b0-4fd9-a24d-ee6d39d729b7 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:00.94135349 +0000 UTC m=+38.739743951 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert") pod "ingress-canary-z4nr6" (UID: "5c18ff90-80b0-4fd9-a24d-ee6d39d729b7") : secret "canary-serving-cert" not found Apr 23 08:14:58.943195 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.943169 2575 generic.go:358] "Generic (PLEG): container finished" podID="00bd34ed-f0ff-40f8-bd12-a9c364e0fe82" containerID="df23ddc7ced39141e5af6e6dbd327123c8049e3ad5de300341f11ef66fce08dd" exitCode=0 Apr 23 08:14:58.943303 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:58.943235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx29n" event={"ID":"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82","Type":"ContainerDied","Data":"df23ddc7ced39141e5af6e6dbd327123c8049e3ad5de300341f11ef66fce08dd"} Apr 23 08:14:59.947821 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:59.947790 2575 generic.go:358] "Generic (PLEG): container finished" podID="00bd34ed-f0ff-40f8-bd12-a9c364e0fe82" containerID="76fc651d3b6d4c19f81211a71b1bbc5d0169e35d642df63bfa42fd58aa623405" exitCode=0 Apr 23 08:14:59.948167 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:14:59.947835 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx29n" event={"ID":"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82","Type":"ContainerDied","Data":"76fc651d3b6d4c19f81211a71b1bbc5d0169e35d642df63bfa42fd58aa623405"} Apr 23 08:15:00.855879 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:00.855833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm" Apr 23 08:15:00.856055 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:00.855986 2575 
projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:15:00.856055 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:00.856005 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55b55655f8-zwssm: secret "image-registry-tls" not found Apr 23 08:15:00.856124 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:00.856067 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls podName:cf15dd93-dbaa-4c02-82cd-5156525a67a9 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:04.856048949 +0000 UTC m=+42.654439418 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls") pod "image-registry-55b55655f8-zwssm" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9") : secret "image-registry-tls" not found Apr 23 08:15:00.952533 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:00.952499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx29n" event={"ID":"00bd34ed-f0ff-40f8-bd12-a9c364e0fe82","Type":"ContainerStarted","Data":"3378d50ed0e5ca6c7e7584644b8ea2b76e1a7a2eea8dbf5b01f5ecdb3b4ec0de"} Apr 23 08:15:00.956434 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:00.956415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6" Apr 23 08:15:00.956501 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:00.956444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722" Apr 23 08:15:00.956556 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:00.956543 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:00.956602 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:00.956594 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls podName:8a62b026-adde-4674-a052-cc9aa72e0a2a nodeName:}" failed. No retries permitted until 2026-04-23 08:15:04.956581458 +0000 UTC m=+42.754971902 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls") pod "dns-default-5r722" (UID: "8a62b026-adde-4674-a052-cc9aa72e0a2a") : secret "dns-default-metrics-tls" not found Apr 23 08:15:00.956644 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:00.956543 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:00.956675 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:00.956668 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert podName:5c18ff90-80b0-4fd9-a24d-ee6d39d729b7 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:04.956652708 +0000 UTC m=+42.755043165 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert") pod "ingress-canary-z4nr6" (UID: "5c18ff90-80b0-4fd9-a24d-ee6d39d729b7") : secret "canary-serving-cert" not found Apr 23 08:15:00.979845 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:00.979771 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kx29n" podStartSLOduration=6.460974865 podStartE2EDuration="38.979759091s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:14:25.459857205 +0000 UTC m=+3.258247650" lastFinishedPulling="2026-04-23 08:14:57.97864143 +0000 UTC m=+35.777031876" observedRunningTime="2026-04-23 08:15:00.978696477 +0000 UTC m=+38.777086945" watchObservedRunningTime="2026-04-23 08:15:00.979759091 +0000 UTC m=+38.778149557" Apr 23 08:15:04.882892 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:04.882845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm" Apr 23 08:15:04.883262 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:04.882996 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:15:04.883262 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:04.883009 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55b55655f8-zwssm: secret "image-registry-tls" not found Apr 23 08:15:04.883262 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:04.883063 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls podName:cf15dd93-dbaa-4c02-82cd-5156525a67a9 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:12.883045303 +0000 UTC m=+50.681435771 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls") pod "image-registry-55b55655f8-zwssm" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9") : secret "image-registry-tls" not found Apr 23 08:15:04.983890 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:04.983859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6" Apr 23 08:15:04.983890 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:04.983893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722" Apr 23 08:15:04.984074 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:04.983995 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:04.984074 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:04.984043 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:04.984136 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:04.984074 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert podName:5c18ff90-80b0-4fd9-a24d-ee6d39d729b7 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:15:12.984056036 +0000 UTC m=+50.782446496 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert") pod "ingress-canary-z4nr6" (UID: "5c18ff90-80b0-4fd9-a24d-ee6d39d729b7") : secret "canary-serving-cert" not found Apr 23 08:15:04.984136 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:04.984090 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls podName:8a62b026-adde-4674-a052-cc9aa72e0a2a nodeName:}" failed. No retries permitted until 2026-04-23 08:15:12.984084174 +0000 UTC m=+50.782474620 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls") pod "dns-default-5r722" (UID: "8a62b026-adde-4674-a052-cc9aa72e0a2a") : secret "dns-default-metrics-tls" not found Apr 23 08:15:10.727613 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:10.727568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:15:10.730659 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:10.730632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e9434c-f939-423a-8859-5a047d434c0c-original-pull-secret\") pod \"global-pull-secret-syncer-9ffdf\" (UID: \"a6e9434c-f939-423a-8859-5a047d434c0c\") " pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:15:10.783110 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:10.783079 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9ffdf" Apr 23 08:15:10.903658 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:10.903626 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9ffdf"] Apr 23 08:15:10.907393 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:15:10.907354 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e9434c_f939_423a_8859_5a047d434c0c.slice/crio-679fead2df2613081d2ba31a4390319a40d8badaaf681cbc5e0e6ade0c68ff6b WatchSource:0}: Error finding container 679fead2df2613081d2ba31a4390319a40d8badaaf681cbc5e0e6ade0c68ff6b: Status 404 returned error can't find the container with id 679fead2df2613081d2ba31a4390319a40d8badaaf681cbc5e0e6ade0c68ff6b Apr 23 08:15:10.974095 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:10.974059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9ffdf" event={"ID":"a6e9434c-f939-423a-8859-5a047d434c0c","Type":"ContainerStarted","Data":"679fead2df2613081d2ba31a4390319a40d8badaaf681cbc5e0e6ade0c68ff6b"} Apr 23 08:15:12.942865 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:12.942821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm" Apr 23 08:15:12.943391 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:12.942992 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:15:12.943391 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:12.943012 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-55b55655f8-zwssm: secret "image-registry-tls" not found Apr 23 08:15:12.943391 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:12.943077 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls podName:cf15dd93-dbaa-4c02-82cd-5156525a67a9 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:28.943057566 +0000 UTC m=+66.741448033 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls") pod "image-registry-55b55655f8-zwssm" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9") : secret "image-registry-tls" not found Apr 23 08:15:13.043911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:13.043876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6" Apr 23 08:15:13.043911 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:13.043912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722" Apr 23 08:15:13.044121 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:13.044023 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:13.044121 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:13.044093 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert podName:5c18ff90-80b0-4fd9-a24d-ee6d39d729b7 nodeName:}" 
failed. No retries permitted until 2026-04-23 08:15:29.044075743 +0000 UTC m=+66.842466201 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert") pod "ingress-canary-z4nr6" (UID: "5c18ff90-80b0-4fd9-a24d-ee6d39d729b7") : secret "canary-serving-cert" not found Apr 23 08:15:13.044212 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:13.044023 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:13.044212 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:13.044201 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls podName:8a62b026-adde-4674-a052-cc9aa72e0a2a nodeName:}" failed. No retries permitted until 2026-04-23 08:15:29.044180596 +0000 UTC m=+66.842571045 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls") pod "dns-default-5r722" (UID: "8a62b026-adde-4674-a052-cc9aa72e0a2a") : secret "dns-default-metrics-tls" not found Apr 23 08:15:15.984741 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:15.984707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9ffdf" event={"ID":"a6e9434c-f939-423a-8859-5a047d434c0c","Type":"ContainerStarted","Data":"c00fa38f6094ff910b88bd2e404bd7d5e19dbd453dcd6873a30aca567fafc894"} Apr 23 08:15:15.999984 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:15.999943 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9ffdf" podStartSLOduration=33.867367036 podStartE2EDuration="37.999931192s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:15:10.909038623 +0000 UTC m=+48.707429068" lastFinishedPulling="2026-04-23 
08:15:15.041602779 +0000 UTC m=+52.839993224" observedRunningTime="2026-04-23 08:15:15.999505609 +0000 UTC m=+53.797896076" watchObservedRunningTime="2026-04-23 08:15:15.999931192 +0000 UTC m=+53.798321636" Apr 23 08:15:21.943841 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:21.943813 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xlcv7" Apr 23 08:15:28.455273 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:28.455229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:15:28.457936 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:28.457919 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:15:28.466019 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:28.465990 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:15:28.466076 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:28.466064 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs podName:75af54d6-d4ae-4e8e-bf63-80cc7a54fe63 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:32.466042467 +0000 UTC m=+130.264432911 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs") pod "network-metrics-daemon-gtsb8" (UID: "75af54d6-d4ae-4e8e-bf63-80cc7a54fe63") : secret "metrics-daemon-secret" not found
Apr 23 08:15:28.657319 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:28.657246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8rp\" (UniqueName: \"kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp\") pod \"network-check-target-b4f6z\" (UID: \"b566e32e-9149-4af2-a4dd-4dba40f00efa\") " pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:15:28.660349 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:28.660328 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 08:15:28.670286 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:28.670262 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 08:15:28.680641 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:28.680621 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw8rp\" (UniqueName: \"kubernetes.io/projected/b566e32e-9149-4af2-a4dd-4dba40f00efa-kube-api-access-dw8rp\") pod \"network-check-target-b4f6z\" (UID: \"b566e32e-9149-4af2-a4dd-4dba40f00efa\") " pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:15:28.796414 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:28.796324 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p5rcb\""
Apr 23 08:15:28.803510 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:28.803476 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:15:28.917876 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:28.917844 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b4f6z"]
Apr 23 08:15:28.922354 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:15:28.922324 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb566e32e_9149_4af2_a4dd_4dba40f00efa.slice/crio-5c1bd134fe7ce6ddc4e08358527e7cba183b65893b9b497c62267595cf34e54b WatchSource:0}: Error finding container 5c1bd134fe7ce6ddc4e08358527e7cba183b65893b9b497c62267595cf34e54b: Status 404 returned error can't find the container with id 5c1bd134fe7ce6ddc4e08358527e7cba183b65893b9b497c62267595cf34e54b
Apr 23 08:15:28.959557 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:28.959517 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:15:28.959768 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:28.959687 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:15:28.959768 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:28.959709 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55b55655f8-zwssm: secret "image-registry-tls" not found
Apr 23 08:15:28.959877 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:28.959796 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls podName:cf15dd93-dbaa-4c02-82cd-5156525a67a9 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:00.959773789 +0000 UTC m=+98.758164248 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls") pod "image-registry-55b55655f8-zwssm" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9") : secret "image-registry-tls" not found
Apr 23 08:15:29.011796 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:29.011760 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b4f6z" event={"ID":"b566e32e-9149-4af2-a4dd-4dba40f00efa","Type":"ContainerStarted","Data":"5c1bd134fe7ce6ddc4e08358527e7cba183b65893b9b497c62267595cf34e54b"}
Apr 23 08:15:29.060643 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:29.060556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6"
Apr 23 08:15:29.060643 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:29.060599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:15:29.060843 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:29.060702 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:15:29.060843 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:29.060720 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:15:29.060843 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:29.060763 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls podName:8a62b026-adde-4674-a052-cc9aa72e0a2a nodeName:}" failed. No retries permitted until 2026-04-23 08:16:01.060746321 +0000 UTC m=+98.859136767 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls") pod "dns-default-5r722" (UID: "8a62b026-adde-4674-a052-cc9aa72e0a2a") : secret "dns-default-metrics-tls" not found
Apr 23 08:15:29.060843 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:15:29.060779 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert podName:5c18ff90-80b0-4fd9-a24d-ee6d39d729b7 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:01.060772495 +0000 UTC m=+98.859162939 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert") pod "ingress-canary-z4nr6" (UID: "5c18ff90-80b0-4fd9-a24d-ee6d39d729b7") : secret "canary-serving-cert" not found
Apr 23 08:15:32.018665 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:32.018628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b4f6z" event={"ID":"b566e32e-9149-4af2-a4dd-4dba40f00efa","Type":"ContainerStarted","Data":"bdc0d27efe3a8ea98a6a29740ce36cd239249a80704e5fad5d64209ade1c6d13"}
Apr 23 08:15:32.019139 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:15:32.018760 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:16:00.989766 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:00.989725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:16:00.990174 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:00.989877 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:16:00.990174 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:00.989895 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55b55655f8-zwssm: secret "image-registry-tls" not found
Apr 23 08:16:00.990174 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:00.989959 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls podName:cf15dd93-dbaa-4c02-82cd-5156525a67a9 nodeName:}" failed. No retries permitted until 2026-04-23 08:17:04.989943525 +0000 UTC m=+162.788333970 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls") pod "image-registry-55b55655f8-zwssm" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9") : secret "image-registry-tls" not found
Apr 23 08:16:01.090182 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:01.090153 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6"
Apr 23 08:16:01.090182 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:01.090183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722"
Apr 23 08:16:01.090366 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:01.090276 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:16:01.090366 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:01.090285 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:16:01.090366 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:01.090339 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls podName:8a62b026-adde-4674-a052-cc9aa72e0a2a nodeName:}" failed. No retries permitted until 2026-04-23 08:17:05.090323253 +0000 UTC m=+162.888713701 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls") pod "dns-default-5r722" (UID: "8a62b026-adde-4674-a052-cc9aa72e0a2a") : secret "dns-default-metrics-tls" not found
Apr 23 08:16:01.090366 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:01.090357 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert podName:5c18ff90-80b0-4fd9-a24d-ee6d39d729b7 nodeName:}" failed. No retries permitted until 2026-04-23 08:17:05.090345202 +0000 UTC m=+162.888735646 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert") pod "ingress-canary-z4nr6" (UID: "5c18ff90-80b0-4fd9-a24d-ee6d39d729b7") : secret "canary-serving-cert" not found
Apr 23 08:16:03.023864 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:03.023834 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b4f6z"
Apr 23 08:16:03.042622 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:03.042565 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b4f6z" podStartSLOduration=98.529316791 podStartE2EDuration="1m41.042552264s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:15:28.924812493 +0000 UTC m=+66.723202940" lastFinishedPulling="2026-04-23 08:15:31.438047959 +0000 UTC m=+69.236438413" observedRunningTime="2026-04-23 08:15:32.034495836 +0000 UTC m=+69.832886306" watchObservedRunningTime="2026-04-23 08:16:03.042552264 +0000 UTC m=+100.840942730"
Apr 23 08:16:27.164584 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.164546 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-578458b7fb-b9s92"]
Apr 23 08:16:27.167346 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.167331 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.170057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.170030 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 23 08:16:27.170057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.170056 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 23 08:16:27.170308 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.170069 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 23 08:16:27.170308 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.170162 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-x9x2k\""
Apr 23 08:16:27.170461 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.170391 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 23 08:16:27.170461 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.170413 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 23 08:16:27.170461 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.170433 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 23 08:16:27.178962 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.178935 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-578458b7fb-b9s92"]
Apr 23 08:16:27.279153 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.279118 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.279373 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.279169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-default-certificate\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.279373 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.279259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.279373 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.279312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9hmb\" (UniqueName: \"kubernetes.io/projected/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-kube-api-access-m9hmb\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.279373 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.279347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-stats-auth\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.380098 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.380047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.380219 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.380121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-default-certificate\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.380219 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.380149 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.380219 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.380170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9hmb\" (UniqueName: \"kubernetes.io/projected/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-kube-api-access-m9hmb\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.380219 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:27.380183 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:16:27.380394 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.380191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-stats-auth\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.380809 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:27.380768 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:27.880735138 +0000 UTC m=+125.679125600 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : configmap references non-existent config key: service-ca.crt
Apr 23 08:16:27.380952 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:27.380833 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:27.88081067 +0000 UTC m=+125.679201114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : secret "router-metrics-certs-default" not found
Apr 23 08:16:27.384566 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.383616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-default-certificate\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.384566 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.383699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-stats-auth\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.389552 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.389525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9hmb\" (UniqueName: \"kubernetes.io/projected/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-kube-api-access-m9hmb\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.885608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.885550 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.885608 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:27.885627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:27.885835 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:27.885721 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:16:27.885835 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:27.885768 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:28.885755017 +0000 UTC m=+126.684145462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : configmap references non-existent config key: service-ca.crt
Apr 23 08:16:27.885835 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:27.885784 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:28.885777085 +0000 UTC m=+126.684167529 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : secret "router-metrics-certs-default" not found
Apr 23 08:16:28.894010 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:28.893959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:28.894419 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:28.894122 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:16:28.894419 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:28.894136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:28.894419 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:28.894192 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:30.894172101 +0000 UTC m=+128.692562547 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : secret "router-metrics-certs-default" not found
Apr 23 08:16:28.894419 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:28.894250 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:30.894239232 +0000 UTC m=+128.692629689 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : configmap references non-existent config key: service-ca.crt
Apr 23 08:16:30.909466 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:30.909404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:30.909962 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:30.909513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:30.909962 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:30.909562 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:34.909544604 +0000 UTC m=+132.707935050 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : configmap references non-existent config key: service-ca.crt
Apr 23 08:16:30.909962 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:30.909693 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:16:30.909962 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:30.909774 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:34.909755262 +0000 UTC m=+132.708145725 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : secret "router-metrics-certs-default" not found
Apr 23 08:16:32.521693 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:32.521656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:16:32.522169 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:32.521821 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 08:16:32.522169 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:32.521915 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs podName:75af54d6-d4ae-4e8e-bf63-80cc7a54fe63 nodeName:}" failed. No retries permitted until 2026-04-23 08:18:34.521893652 +0000 UTC m=+252.320284098 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs") pod "network-metrics-daemon-gtsb8" (UID: "75af54d6-d4ae-4e8e-bf63-80cc7a54fe63") : secret "metrics-daemon-secret" not found
Apr 23 08:16:32.968902 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:32.968871 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sx7f6_f798ddd1-abe6-46fe-8f87-51eb8f211cba/dns-node-resolver/0.log"
Apr 23 08:16:34.169673 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:34.169644 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ht997_ccf4efd3-2881-48aa-80c7-44d62f243db8/node-ca/0.log"
Apr 23 08:16:34.936078 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:34.936020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:34.936278 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:34.936101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:34.936278 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:34.936174 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:16:34.936278 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:34.936233 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:42.936216879 +0000 UTC m=+140.734607324 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : secret "router-metrics-certs-default" not found
Apr 23 08:16:34.936278 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:34.936246 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:42.936240346 +0000 UTC m=+140.734630791 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : configmap references non-existent config key: service-ca.crt
Apr 23 08:16:42.991325 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:42.991274 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:42.991686 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:42.991374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92"
Apr 23 08:16:42.991686 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:42.991416 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:16:42.991686 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:42.991481 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:58.99146106 +0000 UTC m=+156.789851509 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : secret "router-metrics-certs-default" not found
Apr 23 08:16:42.991686 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:42.991500 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle podName:fc24cfd5-bd05-46ce-9716-b6dcf55769b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:58.99148984 +0000 UTC m=+156.789880299 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle") pod "router-default-578458b7fb-b9s92" (UID: "fc24cfd5-bd05-46ce-9716-b6dcf55769b2") : configmap references non-existent config key: service-ca.crt
Apr 23 08:16:44.423401 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.423366 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-v6phf"]
Apr 23 08:16:44.426397 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.426376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v6phf"
Apr 23 08:16:44.429015 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.428993 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 08:16:44.429123 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.428993 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 08:16:44.429123 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.428993 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 08:16:44.430152 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.430134 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-npvv9\""
Apr 23 08:16:44.430251 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.430140 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 08:16:44.437130 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.437110 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v6phf"]
Apr 23 08:16:44.504659 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.504617 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-data-volume\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf"
Apr 23 08:16:44.504867 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.504675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfnm7\" (UniqueName: \"kubernetes.io/projected/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-kube-api-access-wfnm7\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf"
Apr 23 08:16:44.504867 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.504804 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf"
Apr 23 08:16:44.504982 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.504870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-crio-socket\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf"
Apr 23 08:16:44.505031 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.505003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\"
(UniqueName: \"kubernetes.io/configmap/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:44.606344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.606265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:44.606525 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.606374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-data-volume\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:44.606525 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.606398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfnm7\" (UniqueName: \"kubernetes.io/projected/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-kube-api-access-wfnm7\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:44.606525 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.606426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 
08:16:44.606525 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.606463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-crio-socket\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:44.606667 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.606578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-crio-socket\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:44.606667 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:44.606581 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:44.606728 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:44.606695 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls podName:7a73e05a-b51d-4618-8f88-bc4fcbe4fd54 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:45.106675795 +0000 UTC m=+142.905066243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls") pod "insights-runtime-extractor-v6phf" (UID: "7a73e05a-b51d-4618-8f88-bc4fcbe4fd54") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:44.606728 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.606721 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-data-volume\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:44.606909 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.606889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:44.615613 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:44.615592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfnm7\" (UniqueName: \"kubernetes.io/projected/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-kube-api-access-wfnm7\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:45.110677 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:45.110627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " 
pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:45.110891 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:45.110797 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:45.110891 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:45.110875 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls podName:7a73e05a-b51d-4618-8f88-bc4fcbe4fd54 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:46.110856561 +0000 UTC m=+143.909247021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls") pod "insights-runtime-extractor-v6phf" (UID: "7a73e05a-b51d-4618-8f88-bc4fcbe4fd54") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:46.118939 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:46.118909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:46.119326 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:46.119018 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:46.119326 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:46.119071 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls podName:7a73e05a-b51d-4618-8f88-bc4fcbe4fd54 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:16:48.119057032 +0000 UTC m=+145.917447476 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls") pod "insights-runtime-extractor-v6phf" (UID: "7a73e05a-b51d-4618-8f88-bc4fcbe4fd54") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:48.133542 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:48.133492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:48.133944 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:48.133644 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:48.133944 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:16:48.133713 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls podName:7a73e05a-b51d-4618-8f88-bc4fcbe4fd54 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:52.133697402 +0000 UTC m=+149.932087847 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls") pod "insights-runtime-extractor-v6phf" (UID: "7a73e05a-b51d-4618-8f88-bc4fcbe4fd54") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:52.164724 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:52.164673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:52.167084 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:52.167049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a73e05a-b51d-4618-8f88-bc4fcbe4fd54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v6phf\" (UID: \"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54\") " pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:52.235429 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:52.235392 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v6phf" Apr 23 08:16:52.352944 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:52.352915 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v6phf"] Apr 23 08:16:52.356620 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:16:52.356582 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a73e05a_b51d_4618_8f88_bc4fcbe4fd54.slice/crio-01e0d5775081fdf6c9b22c40291bdcb64551ed5021dab077bf80f91ee0b84399 WatchSource:0}: Error finding container 01e0d5775081fdf6c9b22c40291bdcb64551ed5021dab077bf80f91ee0b84399: Status 404 returned error can't find the container with id 01e0d5775081fdf6c9b22c40291bdcb64551ed5021dab077bf80f91ee0b84399 Apr 23 08:16:53.170195 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:53.170160 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v6phf" event={"ID":"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54","Type":"ContainerStarted","Data":"aeaf718257bbb6e081060bb58b7e0480f330876f4b06e3c6ad686b1ef7d1e10f"} Apr 23 08:16:53.170195 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:53.170195 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v6phf" event={"ID":"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54","Type":"ContainerStarted","Data":"01e0d5775081fdf6c9b22c40291bdcb64551ed5021dab077bf80f91ee0b84399"} Apr 23 08:16:54.174207 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:54.174174 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v6phf" event={"ID":"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54","Type":"ContainerStarted","Data":"3060c91ef7d5e01ce18eb4856fc33fd1777d7db842a67d2512a4acd255754e56"} Apr 23 08:16:55.180731 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:55.180640 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-insights/insights-runtime-extractor-v6phf" event={"ID":"7a73e05a-b51d-4618-8f88-bc4fcbe4fd54","Type":"ContainerStarted","Data":"074a85db90307ff72e7612cd5d10b04f06e77d014a68accc804b51ed6869dbf2"} Apr 23 08:16:55.200707 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:55.200653 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-v6phf" podStartSLOduration=8.748212182 podStartE2EDuration="11.200638088s" podCreationTimestamp="2026-04-23 08:16:44 +0000 UTC" firstStartedPulling="2026-04-23 08:16:52.415748191 +0000 UTC m=+150.214138637" lastFinishedPulling="2026-04-23 08:16:54.868174085 +0000 UTC m=+152.666564543" observedRunningTime="2026-04-23 08:16:55.199887259 +0000 UTC m=+152.998277726" watchObservedRunningTime="2026-04-23 08:16:55.200638088 +0000 UTC m=+152.999028554" Apr 23 08:16:59.015547 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:59.015506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92" Apr 23 08:16:59.016038 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:59.015572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92" Apr 23 08:16:59.016240 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:59.016221 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-service-ca-bundle\") pod 
\"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92" Apr 23 08:16:59.017925 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:59.017898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc24cfd5-bd05-46ce-9716-b6dcf55769b2-metrics-certs\") pod \"router-default-578458b7fb-b9s92\" (UID: \"fc24cfd5-bd05-46ce-9716-b6dcf55769b2\") " pod="openshift-ingress/router-default-578458b7fb-b9s92" Apr 23 08:16:59.275470 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:59.275385 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-578458b7fb-b9s92" Apr 23 08:16:59.392570 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:16:59.392536 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-578458b7fb-b9s92"] Apr 23 08:16:59.395797 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:16:59.395771 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc24cfd5_bd05_46ce_9716_b6dcf55769b2.slice/crio-5fdde069f3dce79596fe38b0b5679f8af6693cd4e71d170ce9f77d9a1ed26f8d WatchSource:0}: Error finding container 5fdde069f3dce79596fe38b0b5679f8af6693cd4e71d170ce9f77d9a1ed26f8d: Status 404 returned error can't find the container with id 5fdde069f3dce79596fe38b0b5679f8af6693cd4e71d170ce9f77d9a1ed26f8d Apr 23 08:17:00.145695 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:17:00.145629 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-55b55655f8-zwssm" podUID="cf15dd93-dbaa-4c02-82cd-5156525a67a9" Apr 23 08:17:00.158265 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:17:00.158238 2575 pod_workers.go:1301] "Error syncing 
pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5r722" podUID="8a62b026-adde-4674-a052-cc9aa72e0a2a" Apr 23 08:17:00.180444 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:17:00.180414 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-z4nr6" podUID="5c18ff90-80b0-4fd9-a24d-ee6d39d729b7" Apr 23 08:17:00.192018 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:00.191984 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5r722" Apr 23 08:17:00.192018 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:00.192000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-578458b7fb-b9s92" event={"ID":"fc24cfd5-bd05-46ce-9716-b6dcf55769b2","Type":"ContainerStarted","Data":"d1f74378f1763195da8649dfa0c4e39131e33068cf0643a0fc1b4cf1ca53c349"} Apr 23 08:17:00.192208 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:00.192037 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-578458b7fb-b9s92" event={"ID":"fc24cfd5-bd05-46ce-9716-b6dcf55769b2","Type":"ContainerStarted","Data":"5fdde069f3dce79596fe38b0b5679f8af6693cd4e71d170ce9f77d9a1ed26f8d"} Apr 23 08:17:00.192208 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:00.192196 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55b55655f8-zwssm" Apr 23 08:17:00.211683 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:00.211633 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-578458b7fb-b9s92" podStartSLOduration=33.211615848 podStartE2EDuration="33.211615848s" podCreationTimestamp="2026-04-23 08:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:17:00.211347757 +0000 UTC m=+158.009738224" watchObservedRunningTime="2026-04-23 08:17:00.211615848 +0000 UTC m=+158.010006309" Apr 23 08:17:00.275656 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:00.275613 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-578458b7fb-b9s92" Apr 23 08:17:00.278130 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:00.278104 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-578458b7fb-b9s92" Apr 23 08:17:01.194957 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:01.194921 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-578458b7fb-b9s92" Apr 23 08:17:01.196153 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:01.196136 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-578458b7fb-b9s92" Apr 23 08:17:01.788878 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:17:01.788817 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gtsb8" podUID="75af54d6-d4ae-4e8e-bf63-80cc7a54fe63" Apr 23 08:17:05.065723 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.065691 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm" Apr 23 08:17:05.066183 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.066157 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55b55655f8-zwssm"] Apr 23 08:17:05.066426 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:17:05.066401 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-55b55655f8-zwssm" podUID="cf15dd93-dbaa-4c02-82cd-5156525a67a9" Apr 23 08:17:05.068032 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.068013 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"image-registry-55b55655f8-zwssm\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " pod="openshift-image-registry/image-registry-55b55655f8-zwssm" Apr 23 08:17:05.091847 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.091825 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-n9mmn"] Apr 23 08:17:05.095848 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.095831 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-n9mmn" Apr 23 08:17:05.098422 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.098400 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 08:17:05.098422 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.098400 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-dzxqt\"" Apr 23 08:17:05.098581 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.098539 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 08:17:05.120779 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.120757 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-n9mmn"] Apr 23 08:17:05.166360 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.166333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6" Apr 23 08:17:05.166360 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.166362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722" Apr 23 08:17:05.166506 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.166392 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fmm2\" (UniqueName: \"kubernetes.io/projected/ba7b4e72-c90c-416a-9437-1f377ecf8e36-kube-api-access-6fmm2\") pod \"downloads-6bcc868b7-n9mmn\" 
(UID: \"ba7b4e72-c90c-416a-9437-1f377ecf8e36\") " pod="openshift-console/downloads-6bcc868b7-n9mmn" Apr 23 08:17:05.168544 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.168523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a62b026-adde-4674-a052-cc9aa72e0a2a-metrics-tls\") pod \"dns-default-5r722\" (UID: \"8a62b026-adde-4674-a052-cc9aa72e0a2a\") " pod="openshift-dns/dns-default-5r722" Apr 23 08:17:05.168658 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.168592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c18ff90-80b0-4fd9-a24d-ee6d39d729b7-cert\") pod \"ingress-canary-z4nr6\" (UID: \"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7\") " pod="openshift-ingress-canary/ingress-canary-z4nr6" Apr 23 08:17:05.193615 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.193589 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6c5f794bbc-jk7t7"] Apr 23 08:17:05.196610 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.196596 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7" Apr 23 08:17:05.205827 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.205799 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55b55655f8-zwssm" Apr 23 08:17:05.206888 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.206869 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c5f794bbc-jk7t7"] Apr 23 08:17:05.209941 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.209926 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55b55655f8-zwssm" Apr 23 08:17:05.267046 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267017 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-installation-pull-secrets\") pod \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " Apr 23 08:17:05.267046 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267049 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") pod \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " Apr 23 08:17:05.267227 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267066 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-bound-sa-token\") pod \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " Apr 23 08:17:05.267227 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267093 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf15dd93-dbaa-4c02-82cd-5156525a67a9-ca-trust-extracted\") pod \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " Apr 23 08:17:05.267227 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267149 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-trusted-ca\") pod \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") " Apr 23 08:17:05.267227 
ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267173 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-certificates\") pod \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") "
Apr 23 08:17:05.267227 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267199 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-image-registry-private-configuration\") pod \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") "
Apr 23 08:17:05.267489 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267232 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbg67\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-kube-api-access-dbg67\") pod \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\" (UID: \"cf15dd93-dbaa-4c02-82cd-5156525a67a9\") "
Apr 23 08:17:05.267489 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-ca-trust-extracted\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.267489 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-installation-pull-secrets\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.267489 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fmm2\" (UniqueName: \"kubernetes.io/projected/ba7b4e72-c90c-416a-9437-1f377ecf8e36-kube-api-access-6fmm2\") pod \"downloads-6bcc868b7-n9mmn\" (UID: \"ba7b4e72-c90c-416a-9437-1f377ecf8e36\") " pod="openshift-console/downloads-6bcc868b7-n9mmn"
Apr 23 08:17:05.267489 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-registry-tls\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.267489 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-trusted-ca\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.267773 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267481 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf15dd93-dbaa-4c02-82cd-5156525a67a9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cf15dd93-dbaa-4c02-82cd-5156525a67a9" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:17:05.267773 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267510 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzsds\" (UniqueName: \"kubernetes.io/projected/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-kube-api-access-bzsds\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.267773 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267546 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-image-registry-private-configuration\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.267773 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-bound-sa-token\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.267773 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267625 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cf15dd93-dbaa-4c02-82cd-5156525a67a9" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:17:05.267773 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267634 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cf15dd93-dbaa-4c02-82cd-5156525a67a9" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:17:05.267773 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-registry-certificates\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.268108 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267807 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-trusted-ca\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:17:05.268108 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267828 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-certificates\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:17:05.268108 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.267843 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf15dd93-dbaa-4c02-82cd-5156525a67a9-ca-trust-extracted\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:17:05.269446 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.269416 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cf15dd93-dbaa-4c02-82cd-5156525a67a9" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:17:05.269698 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.269678 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cf15dd93-dbaa-4c02-82cd-5156525a67a9" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:17:05.269698 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.269685 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cf15dd93-dbaa-4c02-82cd-5156525a67a9" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:17:05.269869 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.269844 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "cf15dd93-dbaa-4c02-82cd-5156525a67a9" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:17:05.269915 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.269900 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-kube-api-access-dbg67" (OuterVolumeSpecName: "kube-api-access-dbg67") pod "cf15dd93-dbaa-4c02-82cd-5156525a67a9" (UID: "cf15dd93-dbaa-4c02-82cd-5156525a67a9"). InnerVolumeSpecName "kube-api-access-dbg67". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:17:05.286716 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.286688 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fmm2\" (UniqueName: \"kubernetes.io/projected/ba7b4e72-c90c-416a-9437-1f377ecf8e36-kube-api-access-6fmm2\") pod \"downloads-6bcc868b7-n9mmn\" (UID: \"ba7b4e72-c90c-416a-9437-1f377ecf8e36\") " pod="openshift-console/downloads-6bcc868b7-n9mmn"
Apr 23 08:17:05.295734 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.295715 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f2cmm\""
Apr 23 08:17:05.303692 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.303654 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5r722"
Apr 23 08:17:05.368740 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.368706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-ca-trust-extracted\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.368893 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.368751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-installation-pull-secrets\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.368893 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.368794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-registry-tls\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.368893 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.368825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-trusted-ca\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.368893 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.368853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzsds\" (UniqueName: \"kubernetes.io/projected/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-kube-api-access-bzsds\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.370057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.369143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-image-registry-private-configuration\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.370057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.369189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-bound-sa-token\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.370057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.369260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-registry-certificates\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.370057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.369346 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-registry-tls\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:17:05.370057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.369363 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-image-registry-private-configuration\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:17:05.370057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.369379 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-bound-sa-token\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:17:05.370057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.369392 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dbg67\" (UniqueName: \"kubernetes.io/projected/cf15dd93-dbaa-4c02-82cd-5156525a67a9-kube-api-access-dbg67\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:17:05.370057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.369407 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf15dd93-dbaa-4c02-82cd-5156525a67a9-installation-pull-secrets\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:17:05.370057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.369780 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-ca-trust-extracted\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.370657 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.370513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-registry-certificates\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.371108 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.371071 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-trusted-ca\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.371693 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.371657 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-installation-pull-secrets\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.372876 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.372861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-image-registry-private-configuration\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.373160 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.373131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-registry-tls\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.380204 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.379801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-bound-sa-token\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.380204 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.379887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzsds\" (UniqueName: \"kubernetes.io/projected/51bf1211-8340-4a64-97b4-f7f8f0e8eb17-kube-api-access-bzsds\") pod \"image-registry-6c5f794bbc-jk7t7\" (UID: \"51bf1211-8340-4a64-97b4-f7f8f0e8eb17\") " pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.405674 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.405640 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-n9mmn"
Apr 23 08:17:05.424926 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.424895 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5r722"]
Apr 23 08:17:05.428381 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:05.428352 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a62b026_adde_4674_a052_cc9aa72e0a2a.slice/crio-af36eae5e481f742166abf4b3c7363b816e69d0631662afa0c4985e8b0d48896 WatchSource:0}: Error finding container af36eae5e481f742166abf4b3c7363b816e69d0631662afa0c4985e8b0d48896: Status 404 returned error can't find the container with id af36eae5e481f742166abf4b3c7363b816e69d0631662afa0c4985e8b0d48896
Apr 23 08:17:05.507286 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.507260 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fw7pg\""
Apr 23 08:17:05.515556 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.515529 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:05.522053 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.522027 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-n9mmn"]
Apr 23 08:17:05.525120 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:05.525093 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba7b4e72_c90c_416a_9437_1f377ecf8e36.slice/crio-c536f0dbd45025817695e3941df687e27b808ff79dddaa0b789644324ecbe7ea WatchSource:0}: Error finding container c536f0dbd45025817695e3941df687e27b808ff79dddaa0b789644324ecbe7ea: Status 404 returned error can't find the container with id c536f0dbd45025817695e3941df687e27b808ff79dddaa0b789644324ecbe7ea
Apr 23 08:17:05.650267 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:05.650235 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c5f794bbc-jk7t7"]
Apr 23 08:17:05.653477 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:05.653449 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51bf1211_8340_4a64_97b4_f7f8f0e8eb17.slice/crio-e2e9f9b91446ca308bd6ae1a5a9b39a845f81a71815eab302712757330fa9c3e WatchSource:0}: Error finding container e2e9f9b91446ca308bd6ae1a5a9b39a845f81a71815eab302712757330fa9c3e: Status 404 returned error can't find the container with id e2e9f9b91446ca308bd6ae1a5a9b39a845f81a71815eab302712757330fa9c3e
Apr 23 08:17:06.210101 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:06.210052 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5r722" event={"ID":"8a62b026-adde-4674-a052-cc9aa72e0a2a","Type":"ContainerStarted","Data":"af36eae5e481f742166abf4b3c7363b816e69d0631662afa0c4985e8b0d48896"}
Apr 23 08:17:06.211371 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:06.211328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-n9mmn" event={"ID":"ba7b4e72-c90c-416a-9437-1f377ecf8e36","Type":"ContainerStarted","Data":"c536f0dbd45025817695e3941df687e27b808ff79dddaa0b789644324ecbe7ea"}
Apr 23 08:17:06.212913 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:06.212884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7" event={"ID":"51bf1211-8340-4a64-97b4-f7f8f0e8eb17","Type":"ContainerStarted","Data":"9e3b3b5a1c4f229d671cc35e8723ce54eb392887a654b8ec9d9ebab7d2ea6a7e"}
Apr 23 08:17:06.212913 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:06.212914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7" event={"ID":"51bf1211-8340-4a64-97b4-f7f8f0e8eb17","Type":"ContainerStarted","Data":"e2e9f9b91446ca308bd6ae1a5a9b39a845f81a71815eab302712757330fa9c3e"}
Apr 23 08:17:06.213099 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:06.212933 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55b55655f8-zwssm"
Apr 23 08:17:06.213099 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:06.213049 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7"
Apr 23 08:17:06.232462 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:06.232409 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7" podStartSLOduration=1.232389545 podStartE2EDuration="1.232389545s" podCreationTimestamp="2026-04-23 08:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:17:06.232012568 +0000 UTC m=+164.030403035" watchObservedRunningTime="2026-04-23 08:17:06.232389545 +0000 UTC m=+164.030780014"
Apr 23 08:17:06.268223 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:06.268187 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55b55655f8-zwssm"]
Apr 23 08:17:06.271698 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:06.271672 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-55b55655f8-zwssm"]
Apr 23 08:17:06.777444 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:06.777327 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf15dd93-dbaa-4c02-82cd-5156525a67a9" path="/var/lib/kubelet/pods/cf15dd93-dbaa-4c02-82cd-5156525a67a9/volumes"
Apr 23 08:17:07.217464 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:07.217379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5r722" event={"ID":"8a62b026-adde-4674-a052-cc9aa72e0a2a","Type":"ContainerStarted","Data":"fe0b095c1451ed584ca5b307860388101c0fb5c88e38e3037ccd868045fb71fe"}
Apr 23 08:17:07.217464 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:07.217429 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5r722" event={"ID":"8a62b026-adde-4674-a052-cc9aa72e0a2a","Type":"ContainerStarted","Data":"7759cc8a12b6d80b0b7fdf28f59757f86202a889be64ca8cf85bc1e7d4fb6f5f"}
Apr 23 08:17:07.217944 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:07.217659 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5r722"
Apr 23 08:17:07.234846 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:07.234790 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5r722" podStartSLOduration=128.733470327 podStartE2EDuration="2m10.234771877s" podCreationTimestamp="2026-04-23 08:14:57 +0000 UTC" firstStartedPulling="2026-04-23 08:17:05.430197788 +0000 UTC m=+163.228588236" lastFinishedPulling="2026-04-23 08:17:06.931499336 +0000 UTC m=+164.729889786" observedRunningTime="2026-04-23 08:17:07.233284738 +0000 UTC m=+165.031675229" watchObservedRunningTime="2026-04-23 08:17:07.234771877 +0000 UTC m=+165.033162345"
Apr 23 08:17:08.725733 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.725692 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"]
Apr 23 08:17:08.728795 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.728773 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.733008 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.732975 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-bjhxt\""
Apr 23 08:17:08.733157 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.732979 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 23 08:17:08.733157 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.733038 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 08:17:08.733157 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.733045 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 08:17:08.733157 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.733054 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 08:17:08.733157 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.732985 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 23 08:17:08.737481 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.737449 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"]
Apr 23 08:17:08.797582 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.797550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhf7b\" (UniqueName: \"kubernetes.io/projected/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-kube-api-access-lhf7b\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.797582 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.797592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.797885 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.797688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.797885 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.797752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.898561 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.898512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.898561 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.898561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.898819 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.898632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhf7b\" (UniqueName: \"kubernetes.io/projected/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-kube-api-access-lhf7b\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.898819 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.898665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.898819 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:17:08.898692 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 23 08:17:08.898819 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:17:08.898780 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-prometheus-operator-tls podName:8379a0e9-e80c-4b7e-85e6-a0bfdf88e222 nodeName:}" failed. No retries permitted until 2026-04-23 08:17:09.398756889 +0000 UTC m=+167.197147349 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-qn8zr" (UID: "8379a0e9-e80c-4b7e-85e6-a0bfdf88e222") : secret "prometheus-operator-tls" not found
Apr 23 08:17:08.899718 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.899690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.901499 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.901473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:08.909597 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:08.909572 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhf7b\" (UniqueName: \"kubernetes.io/projected/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-kube-api-access-lhf7b\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:09.402504 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:09.402406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:09.405802 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:09.405756 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8379a0e9-e80c-4b7e-85e6-a0bfdf88e222-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qn8zr\" (UID: \"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:09.640370 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:09.640328 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"
Apr 23 08:17:09.769036 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:09.769001 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-qn8zr"]
Apr 23 08:17:09.772950 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:09.772919 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8379a0e9_e80c_4b7e_85e6_a0bfdf88e222.slice/crio-09c163d4e756ea033d957fc06c30a49a71d2dc38379dd8b604963039da3f9225 WatchSource:0}: Error finding container 09c163d4e756ea033d957fc06c30a49a71d2dc38379dd8b604963039da3f9225: Status 404 returned error can't find the container with id 09c163d4e756ea033d957fc06c30a49a71d2dc38379dd8b604963039da3f9225
Apr 23 08:17:10.228061 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:10.228020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr" event={"ID":"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222","Type":"ContainerStarted","Data":"09c163d4e756ea033d957fc06c30a49a71d2dc38379dd8b604963039da3f9225"}
Apr 23 08:17:11.232078 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:11.232029 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr" event={"ID":"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222","Type":"ContainerStarted","Data":"142403244c4d4ee726963862b4e96a1cfa493875b789ab3acc8076cff1615649"}
Apr 23 08:17:12.236391 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.236348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr" event={"ID":"8379a0e9-e80c-4b7e-85e6-a0bfdf88e222","Type":"ContainerStarted","Data":"e82a1329a2641e42ae783c838b01e3cb9d1da03786e10b28d68621eb20db91c4"}
Apr 23 08:17:12.253863 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.253813 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-qn8zr" podStartSLOduration=2.925854413 podStartE2EDuration="4.253797584s" podCreationTimestamp="2026-04-23 08:17:08 +0000 UTC" firstStartedPulling="2026-04-23 08:17:09.775121533 +0000 UTC m=+167.573511992" lastFinishedPulling="2026-04-23 08:17:11.103064717 +0000 UTC m=+168.901455163" observedRunningTime="2026-04-23 08:17:12.252992396 +0000 UTC m=+170.051382865" watchObservedRunningTime="2026-04-23 08:17:12.253797584 +0000 UTC m=+170.052188070"
Apr 23 08:17:12.774994 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.774952 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8"
Apr 23 08:17:12.828881 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.828844 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d4494dd7c-hrt7t"]
Apr 23 08:17:12.832069 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.832041 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d4494dd7c-hrt7t"
Apr 23 08:17:12.834855 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.834793 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 08:17:12.834855 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.834807 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 08:17:12.835075 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.835058 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 08:17:12.835160 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.835127 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 08:17:12.835242 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.835066 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 08:17:12.835428 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.835349 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xd4rh\""
Apr 23 08:17:12.841867 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.841844 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d4494dd7c-hrt7t"]
Apr 23 08:17:12.935771 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.935735 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-oauth-config\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t"
Apr 23 08:17:12.935977 ip-10-0-129-53 kubenswrapper[2575]: I0423
08:17:12.935788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-oauth-serving-cert\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:12.935977 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.935901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-config\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:12.935977 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.935944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp4jb\" (UniqueName: \"kubernetes.io/projected/95bd7a8f-da39-4894-8a21-8bcf1a410653-kube-api-access-dp4jb\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:12.935977 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.935975 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-service-ca\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:12.936206 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:12.936003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-serving-cert\") pod \"console-5d4494dd7c-hrt7t\" (UID: 
\"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.036406 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.036323 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-oauth-serving-cert\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.036406 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.036400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-config\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.036642 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.036425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dp4jb\" (UniqueName: \"kubernetes.io/projected/95bd7a8f-da39-4894-8a21-8bcf1a410653-kube-api-access-dp4jb\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.036642 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.036442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-service-ca\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.036642 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.036466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-serving-cert\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.036642 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.036506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-oauth-config\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.037103 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.037075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-oauth-serving-cert\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.037601 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.037578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-config\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.038158 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.038136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-service-ca\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.039418 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.039395 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-oauth-config\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.039581 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.039537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-serving-cert\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.045323 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.045282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp4jb\" (UniqueName: \"kubernetes.io/projected/95bd7a8f-da39-4894-8a21-8bcf1a410653-kube-api-access-dp4jb\") pod \"console-5d4494dd7c-hrt7t\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.143699 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.143661 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:13.275442 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.275406 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d4494dd7c-hrt7t"] Apr 23 08:17:13.278500 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:13.278471 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95bd7a8f_da39_4894_8a21_8bcf1a410653.slice/crio-f61617c45384912860b1dae3ec63bcfaa1e8ddadf352a200641821584b7f0585 WatchSource:0}: Error finding container f61617c45384912860b1dae3ec63bcfaa1e8ddadf352a200641821584b7f0585: Status 404 returned error can't find the container with id f61617c45384912860b1dae3ec63bcfaa1e8ddadf352a200641821584b7f0585 Apr 23 08:17:13.773598 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.773564 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z4nr6" Apr 23 08:17:13.776197 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.776171 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkwh7\"" Apr 23 08:17:13.784453 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.784424 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z4nr6" Apr 23 08:17:13.920242 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:13.920209 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z4nr6"] Apr 23 08:17:13.923626 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:13.923597 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c18ff90_80b0_4fd9_a24d_ee6d39d729b7.slice/crio-ef800a255117898b7457e73856ba861e489c626eca5eaed4c727064598150812 WatchSource:0}: Error finding container ef800a255117898b7457e73856ba861e489c626eca5eaed4c727064598150812: Status 404 returned error can't find the container with id ef800a255117898b7457e73856ba861e489c626eca5eaed4c727064598150812 Apr 23 08:17:14.067584 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.067507 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr"] Apr 23 08:17:14.071860 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.071837 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.074580 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.074551 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 08:17:14.074712 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.074598 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 08:17:14.074712 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.074556 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-74zxn\"" Apr 23 08:17:14.083953 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.083934 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr"] Apr 23 08:17:14.100806 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.100781 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wdpfr"] Apr 23 08:17:14.105198 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.105174 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-wdpfr" Apr 23 08:17:14.113607 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.113439 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 08:17:14.113607 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.113451 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-jn6vp\"" Apr 23 08:17:14.113607 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.113492 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 08:17:14.113607 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.113572 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 08:17:14.119265 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.119245 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ncncp"] Apr 23 08:17:14.123268 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.123248 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" Apr 23 08:17:14.126556 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.126534 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 08:17:14.126682 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.126539 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 08:17:14.126682 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.126580 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-gxr2x\"" Apr 23 08:17:14.126682 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.126541 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 08:17:14.136349 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.136328 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ncncp"] Apr 23 08:17:14.144743 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.144715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/923b78bc-6002-41ca-97fa-17f5c638dc18-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.144872 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.144761 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86gw7\" (UniqueName: \"kubernetes.io/projected/923b78bc-6002-41ca-97fa-17f5c638dc18-kube-api-access-86gw7\") pod 
\"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.144872 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.144816 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/923b78bc-6002-41ca-97fa-17f5c638dc18-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.144991 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.144910 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/923b78bc-6002-41ca-97fa-17f5c638dc18-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.245367 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" Apr 23 08:17:14.245567 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wdpfr\" (UID: 
\"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr" Apr 23 08:17:14.245567 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" Apr 23 08:17:14.245567 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245539 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-tls\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr" Apr 23 08:17:14.245772 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245572 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-accelerators-collector-config\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr" Apr 23 08:17:14.245772 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-wtmp\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr" Apr 23 08:17:14.245772 ip-10-0-129-53 kubenswrapper[2575]: 
I0423 08:17:14.245636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/923b78bc-6002-41ca-97fa-17f5c638dc18-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.245772 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86gw7\" (UniqueName: \"kubernetes.io/projected/923b78bc-6002-41ca-97fa-17f5c638dc18-kube-api-access-86gw7\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.245772 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245698 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-textfile\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr" Apr 23 08:17:14.245772 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" Apr 23 08:17:14.245772 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245739 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vjqkv\" (UniqueName: \"kubernetes.io/projected/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-api-access-vjqkv\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" Apr 23 08:17:14.245772 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/61cdcd9f-f094-40e3-9ea7-a17d4855004a-root\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr" Apr 23 08:17:14.246122 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245795 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh5qd\" (UniqueName: \"kubernetes.io/projected/61cdcd9f-f094-40e3-9ea7-a17d4855004a-kube-api-access-zh5qd\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr" Apr 23 08:17:14.246122 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/923b78bc-6002-41ca-97fa-17f5c638dc18-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.246122 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61cdcd9f-f094-40e3-9ea7-a17d4855004a-sys\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " 
pod="openshift-monitoring/node-exporter-wdpfr" Apr 23 08:17:14.246122 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61cdcd9f-f094-40e3-9ea7-a17d4855004a-metrics-client-ca\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr" Apr 23 08:17:14.246122 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/923b78bc-6002-41ca-97fa-17f5c638dc18-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.246122 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245938 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fbb8a1c5-375f-477c-a34a-d075aa60da89-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" Apr 23 08:17:14.246122 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fbb8a1c5-375f-477c-a34a-d075aa60da89-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" Apr 23 08:17:14.246474 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.245856 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-5d4494dd7c-hrt7t" event={"ID":"95bd7a8f-da39-4894-8a21-8bcf1a410653","Type":"ContainerStarted","Data":"f61617c45384912860b1dae3ec63bcfaa1e8ddadf352a200641821584b7f0585"} Apr 23 08:17:14.246915 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.246889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/923b78bc-6002-41ca-97fa-17f5c638dc18-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.248235 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.248205 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z4nr6" event={"ID":"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7","Type":"ContainerStarted","Data":"ef800a255117898b7457e73856ba861e489c626eca5eaed4c727064598150812"} Apr 23 08:17:14.249434 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.249414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/923b78bc-6002-41ca-97fa-17f5c638dc18-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.249592 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.249571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/923b78bc-6002-41ca-97fa-17f5c638dc18-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" Apr 23 08:17:14.254875 ip-10-0-129-53 kubenswrapper[2575]: I0423 
08:17:14.254855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86gw7\" (UniqueName: \"kubernetes.io/projected/923b78bc-6002-41ca-97fa-17f5c638dc18-kube-api-access-86gw7\") pod \"openshift-state-metrics-9d44df66c-vgmkr\" (UID: \"923b78bc-6002-41ca-97fa-17f5c638dc18\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr"
Apr 23 08:17:14.346475 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-wtmp\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.346475 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-textfile\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.346475 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjqkv\" (UniqueName: \"kubernetes.io/projected/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-api-access-vjqkv\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/61cdcd9f-f094-40e3-9ea7-a17d4855004a-root\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5qd\" (UniqueName: \"kubernetes.io/projected/61cdcd9f-f094-40e3-9ea7-a17d4855004a-kube-api-access-zh5qd\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-wtmp\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61cdcd9f-f094-40e3-9ea7-a17d4855004a-sys\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61cdcd9f-f094-40e3-9ea7-a17d4855004a-metrics-client-ca\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/61cdcd9f-f094-40e3-9ea7-a17d4855004a-root\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fbb8a1c5-375f-477c-a34a-d075aa60da89-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61cdcd9f-f094-40e3-9ea7-a17d4855004a-sys\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fbb8a1c5-375f-477c-a34a-d075aa60da89-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.346906 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.346857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-textfile\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.347590 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.347032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-tls\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.347590 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.347072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-accelerators-collector-config\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.347590 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.347216 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fbb8a1c5-375f-477c-a34a-d075aa60da89-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.347590 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.347371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fbb8a1c5-375f-477c-a34a-d075aa60da89-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.347955 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.347932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61cdcd9f-f094-40e3-9ea7-a17d4855004a-metrics-client-ca\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.348066 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.347968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-accelerators-collector-config\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.348066 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.348004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.349875 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.349848 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.350142 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.350096 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.350731 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.350688 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.351048 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.351025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/61cdcd9f-f094-40e3-9ea7-a17d4855004a-node-exporter-tls\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.355693 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.355673 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5qd\" (UniqueName: \"kubernetes.io/projected/61cdcd9f-f094-40e3-9ea7-a17d4855004a-kube-api-access-zh5qd\") pod \"node-exporter-wdpfr\" (UID: \"61cdcd9f-f094-40e3-9ea7-a17d4855004a\") " pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.356052 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.356033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjqkv\" (UniqueName: \"kubernetes.io/projected/fbb8a1c5-375f-477c-a34a-d075aa60da89-kube-api-access-vjqkv\") pod \"kube-state-metrics-69db897b98-ncncp\" (UID: \"fbb8a1c5-375f-477c-a34a-d075aa60da89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.384810 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.384783 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr"
Apr 23 08:17:14.417093 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.417039 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wdpfr"
Apr 23 08:17:14.427802 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:14.427664 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61cdcd9f_f094_40e3_9ea7_a17d4855004a.slice/crio-6316ae0a424ea88f4d69925a6d41d209a75a7c41b7111f20afc0b3c6f8be8db2 WatchSource:0}: Error finding container 6316ae0a424ea88f4d69925a6d41d209a75a7c41b7111f20afc0b3c6f8be8db2: Status 404 returned error can't find the container with id 6316ae0a424ea88f4d69925a6d41d209a75a7c41b7111f20afc0b3c6f8be8db2
Apr 23 08:17:14.436134 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.435592 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp"
Apr 23 08:17:14.553085 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.553052 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr"]
Apr 23 08:17:14.559279 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:14.559238 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod923b78bc_6002_41ca_97fa_17f5c638dc18.slice/crio-3a6fd1ab2c8781ef4b53fa54886deaecb17af4846246a64d7cb23fccec623f2a WatchSource:0}: Error finding container 3a6fd1ab2c8781ef4b53fa54886deaecb17af4846246a64d7cb23fccec623f2a: Status 404 returned error can't find the container with id 3a6fd1ab2c8781ef4b53fa54886deaecb17af4846246a64d7cb23fccec623f2a
Apr 23 08:17:14.603947 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:14.603866 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ncncp"]
Apr 23 08:17:14.608465 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:14.608425 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb8a1c5_375f_477c_a34a_d075aa60da89.slice/crio-0766f27cffab3107a833fd900addf860a66974d7f7dc193a28fefd27644056fc WatchSource:0}: Error finding container 0766f27cffab3107a833fd900addf860a66974d7f7dc193a28fefd27644056fc: Status 404 returned error can't find the container with id 0766f27cffab3107a833fd900addf860a66974d7f7dc193a28fefd27644056fc
Apr 23 08:17:15.253727 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:15.253684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" event={"ID":"fbb8a1c5-375f-477c-a34a-d075aa60da89","Type":"ContainerStarted","Data":"0766f27cffab3107a833fd900addf860a66974d7f7dc193a28fefd27644056fc"}
Apr 23 08:17:15.256440 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:15.256408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" event={"ID":"923b78bc-6002-41ca-97fa-17f5c638dc18","Type":"ContainerStarted","Data":"4439b38ab372d3f2cfd78102daeaa94c9225b6c46573c046cb83e778b4ffc05f"}
Apr 23 08:17:15.256577 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:15.256458 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" event={"ID":"923b78bc-6002-41ca-97fa-17f5c638dc18","Type":"ContainerStarted","Data":"84430c86ab567d75f109aeb485ded4c88e4c82fb27f20e54007293b3015466c5"}
Apr 23 08:17:15.256577 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:15.256473 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" event={"ID":"923b78bc-6002-41ca-97fa-17f5c638dc18","Type":"ContainerStarted","Data":"3a6fd1ab2c8781ef4b53fa54886deaecb17af4846246a64d7cb23fccec623f2a"}
Apr 23 08:17:15.257857 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:15.257827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wdpfr" event={"ID":"61cdcd9f-f094-40e3-9ea7-a17d4855004a","Type":"ContainerStarted","Data":"6316ae0a424ea88f4d69925a6d41d209a75a7c41b7111f20afc0b3c6f8be8db2"}
Apr 23 08:17:17.223363 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:17.223329 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5r722"
Apr 23 08:17:18.585578 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.585542 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-989d867f7-8tp78"]
Apr 23 08:17:18.588970 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.588936 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.592868 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.592842 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 23 08:17:18.593185 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.593162 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 23 08:17:18.593352 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.593279 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-xsvjm\""
Apr 23 08:17:18.593455 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.593426 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 08:17:18.593577 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.593551 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-2ef1jrq43ug5s\""
Apr 23 08:17:18.593696 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.593669 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 23 08:17:18.596632 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.596609 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-989d867f7-8tp78"]
Apr 23 08:17:18.687153 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.687110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-secret-metrics-server-tls\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.687338 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.687178 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-audit-log\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.687338 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.687202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-client-ca-bundle\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.687338 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.687228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-secret-metrics-server-client-certs\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.687474 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.687353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-metrics-server-audit-profiles\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.687474 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.687390 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs2vl\" (UniqueName: \"kubernetes.io/projected/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-kube-api-access-zs2vl\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.687474 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.687425 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.788319 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.788269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-metrics-server-audit-profiles\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.788485 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.788325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zs2vl\" (UniqueName: \"kubernetes.io/projected/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-kube-api-access-zs2vl\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.788485 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.788360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.788485 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.788445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-secret-metrics-server-tls\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.788654 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.788487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-audit-log\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.788654 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.788516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-client-ca-bundle\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.788654 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.788554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-secret-metrics-server-client-certs\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.788983 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.788954 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-audit-log\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.789408 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.789360 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-metrics-server-audit-profiles\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.789600 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.789573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.791435 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.791409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-secret-metrics-server-client-certs\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.791546 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.791498 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-secret-metrics-server-tls\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.791602 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.791565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-client-ca-bundle\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.800612 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.800588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs2vl\" (UniqueName: \"kubernetes.io/projected/0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74-kube-api-access-zs2vl\") pod \"metrics-server-989d867f7-8tp78\" (UID: \"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74\") " pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.855350 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.855259 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm"]
Apr 23 08:17:18.860839 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.860817 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm"
Apr 23 08:17:18.864580 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.864360 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-slg4l\""
Apr 23 08:17:18.864580 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.864376 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 23 08:17:18.867476 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.867436 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm"]
Apr 23 08:17:18.903226 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.903193 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-989d867f7-8tp78"
Apr 23 08:17:18.989995 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:18.989955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3345d53f-22be-4035-8bde-e7cca402f09e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4pntm\" (UID: \"3345d53f-22be-4035-8bde-e7cca402f09e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm"
Apr 23 08:17:19.091183 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:19.091147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3345d53f-22be-4035-8bde-e7cca402f09e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4pntm\" (UID: \"3345d53f-22be-4035-8bde-e7cca402f09e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm"
Apr 23 08:17:19.094058 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:19.094029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3345d53f-22be-4035-8bde-e7cca402f09e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4pntm\" (UID: \"3345d53f-22be-4035-8bde-e7cca402f09e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm"
Apr 23 08:17:19.173027 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:19.172993 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm"
Apr 23 08:17:20.296927 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.296895 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:17:20.323453 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.323420 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:17:20.323660 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.323636 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.329092 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.328753 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 08:17:20.329092 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.328787 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 08:17:20.329092 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.328890 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 08:17:20.329092 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.328912 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-570e3aeg8dk6j\""
Apr 23 08:17:20.329092 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.328947 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 08:17:20.329092 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.328997 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 08:17:20.329092 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.328753 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 08:17:20.329528 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.329275 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 08:17:20.329528 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.329318 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 08:17:20.329920 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.329871 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 08:17:20.330049 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.330007 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 08:17:20.330718 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.330701 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n4jfb\""
Apr 23 08:17:20.334324 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.334084 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 08:17:20.341138 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.341112 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 08:17:20.402425 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.402597 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.402597 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-web-config\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.402597 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.402597 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.402597 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.402597 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.402889 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxt8r\" (UniqueName: \"kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-kube-api-access-kxt8r\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.402889 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.402889 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:20.402889 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.402889 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-config-out\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.402889 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.402889 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.402889 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-config\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.403254 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.403254 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.403254 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.402962 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.503874 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.503834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504087 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.503889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504087 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.503919 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504087 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.503949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504087 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxt8r\" (UniqueName: \"kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-kube-api-access-kxt8r\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504087 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504087 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-config-out\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-config\") pod \"prometheus-k8s-0\" (UID: 
\"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504274 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504332 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.504491 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.504421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-web-config\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.505436 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.505388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.506422 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.506393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.507738 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.507407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.509785 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.508405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.509785 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.508895 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.509785 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.509014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.509785 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.509647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.509785 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.509709 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-config\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.510509 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.510483 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.512859 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.510533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.512859 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.510573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-web-config\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.512859 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.512632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.525871 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.525827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.525871 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.525829 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.528023 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.528001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxt8r\" (UniqueName: \"kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-kube-api-access-kxt8r\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.530754 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.530730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.532610 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.532570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-config-out\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.543265 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.542721 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:20.644222 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:20.644190 2575 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:24.154409 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:24.154370 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-989d867f7-8tp78"] Apr 23 08:17:24.418172 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:24.418123 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b7ac758_0bd8_4ac9_9785_cfd4aa3a3e74.slice/crio-2a6056b5b21a062d2625d66c8ab4a220125768cc8b27b716f7fb0a422cbbede5 WatchSource:0}: Error finding container 2a6056b5b21a062d2625d66c8ab4a220125768cc8b27b716f7fb0a422cbbede5: Status 404 returned error can't find the container with id 2a6056b5b21a062d2625d66c8ab4a220125768cc8b27b716f7fb0a422cbbede5 Apr 23 08:17:24.553892 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:24.553851 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm"] Apr 23 08:17:24.559928 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:24.559883 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3345d53f_22be_4035_8bde_e7cca402f09e.slice/crio-150568b76a7fcb3e56fb2aa895f3df2d7046bc97b6899bbdcb04cc81a7c1287c WatchSource:0}: Error finding container 150568b76a7fcb3e56fb2aa895f3df2d7046bc97b6899bbdcb04cc81a7c1287c: Status 404 returned error can't find the container with id 150568b76a7fcb3e56fb2aa895f3df2d7046bc97b6899bbdcb04cc81a7c1287c Apr 23 08:17:24.596074 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:24.596023 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:17:24.600455 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:17:24.600402 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6d02624_cb56_48c2_b2d6_afa71c610d89.slice/crio-f4899c3a12a4d9890518988b20930f50b8306cca669ca8900de2b013a4459819 WatchSource:0}: Error finding container f4899c3a12a4d9890518988b20930f50b8306cca669ca8900de2b013a4459819: Status 404 returned error can't find the container with id f4899c3a12a4d9890518988b20930f50b8306cca669ca8900de2b013a4459819 Apr 23 08:17:25.303565 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.303523 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z4nr6" event={"ID":"5c18ff90-80b0-4fd9-a24d-ee6d39d729b7","Type":"ContainerStarted","Data":"0ce2d4b736efb0e95039d961dba9fb57d0037b355afe2e2b25f4aaaeea637563"} Apr 23 08:17:25.306801 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.306720 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" event={"ID":"fbb8a1c5-375f-477c-a34a-d075aa60da89","Type":"ContainerStarted","Data":"f9356253b020b256e29cbf19997dc130aa168b023227d7b3842c110e4f2b2ce2"} Apr 23 08:17:25.306801 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.306755 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" event={"ID":"fbb8a1c5-375f-477c-a34a-d075aa60da89","Type":"ContainerStarted","Data":"81267d5d0236e52df5c68698ccadca8ff91d56cfec290b76cdf574cfe0f76922"} Apr 23 08:17:25.306801 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.306779 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" event={"ID":"fbb8a1c5-375f-477c-a34a-d075aa60da89","Type":"ContainerStarted","Data":"8d69f944e54c4180d498f04587ff7010dad186ddf8b0bc09ea71bf5eb9564c6e"} Apr 23 08:17:25.308800 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.308725 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-n9mmn" 
event={"ID":"ba7b4e72-c90c-416a-9437-1f377ecf8e36","Type":"ContainerStarted","Data":"eb3fb2e7677c1c1560acd05e78fe83d4c7dfb27849a0529b3be32e0758c398e6"} Apr 23 08:17:25.309605 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.309585 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-n9mmn" Apr 23 08:17:25.311288 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.311246 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d4494dd7c-hrt7t" event={"ID":"95bd7a8f-da39-4894-8a21-8bcf1a410653","Type":"ContainerStarted","Data":"0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c"} Apr 23 08:17:25.316682 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.316650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" event={"ID":"923b78bc-6002-41ca-97fa-17f5c638dc18","Type":"ContainerStarted","Data":"7b0e49c58376aff05c6b06011cd2ff719b56e617682e08785cd5461cfa9d19b5"} Apr 23 08:17:25.320163 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.319879 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z4nr6" podStartSLOduration=137.764776847 podStartE2EDuration="2m28.319863227s" podCreationTimestamp="2026-04-23 08:14:57 +0000 UTC" firstStartedPulling="2026-04-23 08:17:13.925826134 +0000 UTC m=+171.724216586" lastFinishedPulling="2026-04-23 08:17:24.480912504 +0000 UTC m=+182.279302966" observedRunningTime="2026-04-23 08:17:25.318260597 +0000 UTC m=+183.116651067" watchObservedRunningTime="2026-04-23 08:17:25.319863227 +0000 UTC m=+183.118253696" Apr 23 08:17:25.320962 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.320792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm" 
event={"ID":"3345d53f-22be-4035-8bde-e7cca402f09e","Type":"ContainerStarted","Data":"150568b76a7fcb3e56fb2aa895f3df2d7046bc97b6899bbdcb04cc81a7c1287c"} Apr 23 08:17:25.326126 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.321471 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-n9mmn" Apr 23 08:17:25.326126 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.323803 2575 generic.go:358] "Generic (PLEG): container finished" podID="61cdcd9f-f094-40e3-9ea7-a17d4855004a" containerID="41630e6def5bbee3d14658460333b0729c4779dbb682b27e69e552ce0d489064" exitCode=0 Apr 23 08:17:25.326126 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.323883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wdpfr" event={"ID":"61cdcd9f-f094-40e3-9ea7-a17d4855004a","Type":"ContainerDied","Data":"41630e6def5bbee3d14658460333b0729c4779dbb682b27e69e552ce0d489064"} Apr 23 08:17:25.333018 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.332820 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerStarted","Data":"f4899c3a12a4d9890518988b20930f50b8306cca669ca8900de2b013a4459819"} Apr 23 08:17:25.337541 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.337496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-989d867f7-8tp78" event={"ID":"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74","Type":"ContainerStarted","Data":"2a6056b5b21a062d2625d66c8ab4a220125768cc8b27b716f7fb0a422cbbede5"} Apr 23 08:17:25.351032 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.349617 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d4494dd7c-hrt7t" podStartSLOduration=2.212229312 podStartE2EDuration="13.349600231s" podCreationTimestamp="2026-04-23 08:17:12 +0000 UTC" firstStartedPulling="2026-04-23 
08:17:13.280791374 +0000 UTC m=+171.079181828" lastFinishedPulling="2026-04-23 08:17:24.418162286 +0000 UTC m=+182.216552747" observedRunningTime="2026-04-23 08:17:25.349278278 +0000 UTC m=+183.147668745" watchObservedRunningTime="2026-04-23 08:17:25.349600231 +0000 UTC m=+183.147990698" Apr 23 08:17:25.374037 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.373971 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgmkr" podStartSLOduration=1.607898646 podStartE2EDuration="11.373948482s" podCreationTimestamp="2026-04-23 08:17:14 +0000 UTC" firstStartedPulling="2026-04-23 08:17:14.723774179 +0000 UTC m=+172.522164638" lastFinishedPulling="2026-04-23 08:17:24.489824025 +0000 UTC m=+182.288214474" observedRunningTime="2026-04-23 08:17:25.371732762 +0000 UTC m=+183.170123230" watchObservedRunningTime="2026-04-23 08:17:25.373948482 +0000 UTC m=+183.172338951" Apr 23 08:17:25.419124 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.418900 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-ncncp" podStartSLOduration=1.5407538280000002 podStartE2EDuration="11.418879581s" podCreationTimestamp="2026-04-23 08:17:14 +0000 UTC" firstStartedPulling="2026-04-23 08:17:14.61121147 +0000 UTC m=+172.409601916" lastFinishedPulling="2026-04-23 08:17:24.489337219 +0000 UTC m=+182.287727669" observedRunningTime="2026-04-23 08:17:25.398660584 +0000 UTC m=+183.197051050" watchObservedRunningTime="2026-04-23 08:17:25.418879581 +0000 UTC m=+183.217270051" Apr 23 08:17:25.419688 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:25.419646 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-n9mmn" podStartSLOduration=1.465634361 podStartE2EDuration="20.419636375s" podCreationTimestamp="2026-04-23 08:17:05 +0000 UTC" firstStartedPulling="2026-04-23 08:17:05.526908937 +0000 
UTC m=+163.325299382" lastFinishedPulling="2026-04-23 08:17:24.480910938 +0000 UTC m=+182.279301396" observedRunningTime="2026-04-23 08:17:25.417890522 +0000 UTC m=+183.216280992" watchObservedRunningTime="2026-04-23 08:17:25.419636375 +0000 UTC m=+183.218026843" Apr 23 08:17:26.345043 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:26.344991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wdpfr" event={"ID":"61cdcd9f-f094-40e3-9ea7-a17d4855004a","Type":"ContainerStarted","Data":"398907e459cc3ca80699769ed1010af9841e32a8ebf4b51b3a14146f81e7bdd4"} Apr 23 08:17:26.345043 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:26.345041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wdpfr" event={"ID":"61cdcd9f-f094-40e3-9ea7-a17d4855004a","Type":"ContainerStarted","Data":"bf2ffac2e7a603255c2b9a35c6a25bbbb1a73e008861f048a247fb28482e5961"} Apr 23 08:17:27.137329 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:27.137256 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wdpfr" podStartSLOduration=3.076786239 podStartE2EDuration="13.137237041s" podCreationTimestamp="2026-04-23 08:17:14 +0000 UTC" firstStartedPulling="2026-04-23 08:17:14.430818475 +0000 UTC m=+172.229208924" lastFinishedPulling="2026-04-23 08:17:24.491269269 +0000 UTC m=+182.289659726" observedRunningTime="2026-04-23 08:17:26.368223001 +0000 UTC m=+184.166613470" watchObservedRunningTime="2026-04-23 08:17:27.137237041 +0000 UTC m=+184.935627509" Apr 23 08:17:27.137970 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:27.137939 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d4494dd7c-hrt7t"] Apr 23 08:17:27.223533 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:27.223282 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6c5f794bbc-jk7t7" Apr 23 
08:17:28.352995 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:28.352949 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm" event={"ID":"3345d53f-22be-4035-8bde-e7cca402f09e","Type":"ContainerStarted","Data":"55edeaf745f259e8af882c48c321dbdde1da20d9e43454e476181e61856fd148"} Apr 23 08:17:28.353487 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:28.353128 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm" Apr 23 08:17:28.355135 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:28.355104 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerID="57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec" exitCode=0 Apr 23 08:17:28.355270 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:28.355140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerDied","Data":"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"} Apr 23 08:17:28.357603 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:28.357102 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-989d867f7-8tp78" event={"ID":"0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74","Type":"ContainerStarted","Data":"229e1e9319787f6c346fc03dbd4b2f98fd3680d34dcd283f3fc91d6a71753626"} Apr 23 08:17:28.360766 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:28.360741 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm" Apr 23 08:17:28.368568 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:28.368516 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4pntm" podStartSLOduration=7.124601054 
podStartE2EDuration="10.368500583s" podCreationTimestamp="2026-04-23 08:17:18 +0000 UTC" firstStartedPulling="2026-04-23 08:17:24.562678787 +0000 UTC m=+182.361069249" lastFinishedPulling="2026-04-23 08:17:27.806578326 +0000 UTC m=+185.604968778" observedRunningTime="2026-04-23 08:17:28.367930986 +0000 UTC m=+186.166321454" watchObservedRunningTime="2026-04-23 08:17:28.368500583 +0000 UTC m=+186.166891051" Apr 23 08:17:28.399576 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:28.399513 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-989d867f7-8tp78" podStartSLOduration=7.042544402 podStartE2EDuration="10.399466673s" podCreationTimestamp="2026-04-23 08:17:18 +0000 UTC" firstStartedPulling="2026-04-23 08:17:24.452742704 +0000 UTC m=+182.251133163" lastFinishedPulling="2026-04-23 08:17:27.809664985 +0000 UTC m=+185.608055434" observedRunningTime="2026-04-23 08:17:28.398526267 +0000 UTC m=+186.196916747" watchObservedRunningTime="2026-04-23 08:17:28.399466673 +0000 UTC m=+186.197857137" Apr 23 08:17:28.496821 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:17:28.496537 2575 configmap.go:193] Couldn't get configMap openshift-monitoring/prometheus-k8s-rulefiles-0: configmap "prometheus-k8s-rulefiles-0" not found Apr 23 08:17:28.496821 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:17:28.496613 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-rulefiles-0 podName:f6d02624-cb56-48c2-b2d6-afa71c610d89 nodeName:}" failed. No retries permitted until 2026-04-23 08:17:28.996591226 +0000 UTC m=+186.794981682 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-k8s-rulefiles-0" (UniqueName: "kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-rulefiles-0") pod "prometheus-k8s-0" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89") : configmap "prometheus-k8s-rulefiles-0" not found Apr 23 08:17:32.374100 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:32.374052 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerStarted","Data":"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"} Apr 23 08:17:33.144325 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:33.144236 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:33.381685 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:33.381633 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerStarted","Data":"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"} Apr 23 08:17:35.392124 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:35.392040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerStarted","Data":"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"} Apr 23 08:17:35.392124 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:35.392080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerStarted","Data":"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"} Apr 23 08:17:35.392124 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:35.392090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerStarted","Data":"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"} Apr 23 08:17:35.392124 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:35.392098 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerStarted","Data":"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"} Apr 23 08:17:35.428604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:35.428545 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.155060558 podStartE2EDuration="15.428527423s" podCreationTimestamp="2026-04-23 08:17:20 +0000 UTC" firstStartedPulling="2026-04-23 08:17:24.602170594 +0000 UTC m=+182.400561052" lastFinishedPulling="2026-04-23 08:17:34.875637457 +0000 UTC m=+192.674027917" observedRunningTime="2026-04-23 08:17:35.425803028 +0000 UTC m=+193.224193537" watchObservedRunningTime="2026-04-23 08:17:35.428527423 +0000 UTC m=+193.226917889" Apr 23 08:17:35.645287 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:35.645208 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:17:38.903540 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:38.903502 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-989d867f7-8tp78" Apr 23 08:17:38.903931 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:38.903550 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-989d867f7-8tp78" Apr 23 08:17:52.369476 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.369412 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-console/console-5d4494dd7c-hrt7t" podUID="95bd7a8f-da39-4894-8a21-8bcf1a410653" containerName="console" containerID="cri-o://0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c" gracePeriod=15 Apr 23 08:17:52.603008 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.602983 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d4494dd7c-hrt7t_95bd7a8f-da39-4894-8a21-8bcf1a410653/console/0.log" Apr 23 08:17:52.603136 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.603046 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:52.715747 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.715660 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-serving-cert\") pod \"95bd7a8f-da39-4894-8a21-8bcf1a410653\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " Apr 23 08:17:52.715747 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.715704 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-service-ca\") pod \"95bd7a8f-da39-4894-8a21-8bcf1a410653\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " Apr 23 08:17:52.715932 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.715830 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-config\") pod \"95bd7a8f-da39-4894-8a21-8bcf1a410653\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " Apr 23 08:17:52.715932 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.715874 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-oauth-serving-cert\") pod \"95bd7a8f-da39-4894-8a21-8bcf1a410653\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " Apr 23 08:17:52.715932 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.715904 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp4jb\" (UniqueName: \"kubernetes.io/projected/95bd7a8f-da39-4894-8a21-8bcf1a410653-kube-api-access-dp4jb\") pod \"95bd7a8f-da39-4894-8a21-8bcf1a410653\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " Apr 23 08:17:52.715932 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.715927 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-oauth-config\") pod \"95bd7a8f-da39-4894-8a21-8bcf1a410653\" (UID: \"95bd7a8f-da39-4894-8a21-8bcf1a410653\") " Apr 23 08:17:52.716204 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.716179 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-service-ca" (OuterVolumeSpecName: "service-ca") pod "95bd7a8f-da39-4894-8a21-8bcf1a410653" (UID: "95bd7a8f-da39-4894-8a21-8bcf1a410653"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:17:52.716348 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.716216 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-config" (OuterVolumeSpecName: "console-config") pod "95bd7a8f-da39-4894-8a21-8bcf1a410653" (UID: "95bd7a8f-da39-4894-8a21-8bcf1a410653"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:17:52.716348 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.716327 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "95bd7a8f-da39-4894-8a21-8bcf1a410653" (UID: "95bd7a8f-da39-4894-8a21-8bcf1a410653"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:17:52.717984 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.717955 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "95bd7a8f-da39-4894-8a21-8bcf1a410653" (UID: "95bd7a8f-da39-4894-8a21-8bcf1a410653"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:17:52.718078 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.718031 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "95bd7a8f-da39-4894-8a21-8bcf1a410653" (UID: "95bd7a8f-da39-4894-8a21-8bcf1a410653"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:17:52.718130 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.718111 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95bd7a8f-da39-4894-8a21-8bcf1a410653-kube-api-access-dp4jb" (OuterVolumeSpecName: "kube-api-access-dp4jb") pod "95bd7a8f-da39-4894-8a21-8bcf1a410653" (UID: "95bd7a8f-da39-4894-8a21-8bcf1a410653"). InnerVolumeSpecName "kube-api-access-dp4jb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:17:52.816978 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.816929 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-config\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\"" Apr 23 08:17:52.816978 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.816969 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-oauth-serving-cert\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\"" Apr 23 08:17:52.816978 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.816980 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dp4jb\" (UniqueName: \"kubernetes.io/projected/95bd7a8f-da39-4894-8a21-8bcf1a410653-kube-api-access-dp4jb\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\"" Apr 23 08:17:52.816978 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.816991 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-oauth-config\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\"" Apr 23 08:17:52.816978 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.817000 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95bd7a8f-da39-4894-8a21-8bcf1a410653-console-serving-cert\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\"" Apr 23 08:17:52.817286 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:52.817008 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95bd7a8f-da39-4894-8a21-8bcf1a410653-service-ca\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\"" Apr 23 08:17:53.445343 ip-10-0-129-53 
kubenswrapper[2575]: I0423 08:17:53.445312 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d4494dd7c-hrt7t_95bd7a8f-da39-4894-8a21-8bcf1a410653/console/0.log" Apr 23 08:17:53.445823 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:53.445348 2575 generic.go:358] "Generic (PLEG): container finished" podID="95bd7a8f-da39-4894-8a21-8bcf1a410653" containerID="0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c" exitCode=2 Apr 23 08:17:53.445823 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:53.445432 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d4494dd7c-hrt7t" Apr 23 08:17:53.445823 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:53.445447 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d4494dd7c-hrt7t" event={"ID":"95bd7a8f-da39-4894-8a21-8bcf1a410653","Type":"ContainerDied","Data":"0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c"} Apr 23 08:17:53.445823 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:53.445480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d4494dd7c-hrt7t" event={"ID":"95bd7a8f-da39-4894-8a21-8bcf1a410653","Type":"ContainerDied","Data":"f61617c45384912860b1dae3ec63bcfaa1e8ddadf352a200641821584b7f0585"} Apr 23 08:17:53.445823 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:53.445495 2575 scope.go:117] "RemoveContainer" containerID="0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c" Apr 23 08:17:53.453500 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:53.453478 2575 scope.go:117] "RemoveContainer" containerID="0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c" Apr 23 08:17:53.453802 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:17:53.453775 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c\": container with ID starting with 0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c not found: ID does not exist" containerID="0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c" Apr 23 08:17:53.453869 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:53.453816 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c"} err="failed to get container status \"0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c\": rpc error: code = NotFound desc = could not find container \"0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c\": container with ID starting with 0a3a42b567eea14760a639f42d1a89f7908067eb8ae58e0ee51f532fc12ace3c not found: ID does not exist" Apr 23 08:17:53.463214 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:53.463186 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d4494dd7c-hrt7t"] Apr 23 08:17:53.467262 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:53.467240 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d4494dd7c-hrt7t"] Apr 23 08:17:54.777019 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:54.776988 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95bd7a8f-da39-4894-8a21-8bcf1a410653" path="/var/lib/kubelet/pods/95bd7a8f-da39-4894-8a21-8bcf1a410653/volumes" Apr 23 08:17:58.909244 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:58.909210 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-989d867f7-8tp78" Apr 23 08:17:58.913210 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:17:58.913189 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-989d867f7-8tp78" Apr 23 08:18:20.645483 ip-10-0-129-53 
kubenswrapper[2575]: I0423 08:18:20.645437 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:20.660939 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:20.660909 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:21.544014 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:21.543987 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:34.595259 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:34.595223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:18:34.597502 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:34.597471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75af54d6-d4ae-4e8e-bf63-80cc7a54fe63-metrics-certs\") pod \"network-metrics-daemon-gtsb8\" (UID: \"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63\") " pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:18:34.678424 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:34.678390 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vjj4z\"" Apr 23 08:18:34.685720 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:34.685702 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtsb8" Apr 23 08:18:34.806991 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:34.806961 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gtsb8"] Apr 23 08:18:34.810234 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:18:34.810204 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75af54d6_d4ae_4e8e_bf63_80cc7a54fe63.slice/crio-b3192a38079f1d10d313b2374ae761176057a4b0c70502952f90f1ce6028e756 WatchSource:0}: Error finding container b3192a38079f1d10d313b2374ae761176057a4b0c70502952f90f1ce6028e756: Status 404 returned error can't find the container with id b3192a38079f1d10d313b2374ae761176057a4b0c70502952f90f1ce6028e756 Apr 23 08:18:35.577355 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:35.577280 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gtsb8" event={"ID":"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63","Type":"ContainerStarted","Data":"b3192a38079f1d10d313b2374ae761176057a4b0c70502952f90f1ce6028e756"} Apr 23 08:18:36.581954 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:36.581906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gtsb8" event={"ID":"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63","Type":"ContainerStarted","Data":"5247e30c59988bf61ad30b32d06766100f25c9ca47a2ea09f0526ec27343de9f"} Apr 23 08:18:36.581954 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:36.581952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gtsb8" event={"ID":"75af54d6-d4ae-4e8e-bf63-80cc7a54fe63","Type":"ContainerStarted","Data":"e9f8f5a3f28fca0f6df6d3c5d94d80f4a26ca7dc6605c38115f4e769108a6e7a"} Apr 23 08:18:36.602173 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:36.602123 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-gtsb8" podStartSLOduration=253.703922437 podStartE2EDuration="4m14.602108354s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:18:34.812352728 +0000 UTC m=+252.610743178" lastFinishedPulling="2026-04-23 08:18:35.710538647 +0000 UTC m=+253.508929095" observedRunningTime="2026-04-23 08:18:36.599704616 +0000 UTC m=+254.398095084" watchObservedRunningTime="2026-04-23 08:18:36.602108354 +0000 UTC m=+254.400498827" Apr 23 08:18:38.664166 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:38.664115 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:18:38.664685 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:38.664651 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="prometheus" containerID="cri-o://806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6" gracePeriod=600 Apr 23 08:18:38.664777 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:38.664687 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="thanos-sidecar" containerID="cri-o://3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6" gracePeriod=600 Apr 23 08:18:38.664777 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:38.664686 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="config-reloader" containerID="cri-o://4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d" gracePeriod=600 Apr 23 08:18:38.664878 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:38.664791 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy-web" containerID="cri-o://d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b" gracePeriod=600 Apr 23 08:18:38.664878 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:38.664825 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy-thanos" containerID="cri-o://4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d" gracePeriod=600 Apr 23 08:18:38.665023 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:38.664668 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy" containerID="cri-o://23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541" gracePeriod=600 Apr 23 08:18:38.922659 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:38.922626 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.034789 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.034753 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxt8r\" (UniqueName: \"kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-kube-api-access-kxt8r\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " Apr 23 08:18:39.034789 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.034792 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-kubelet-serving-ca-bundle\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " Apr 23 08:18:39.035036 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.034823 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-metrics-client-certs\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " Apr 23 08:18:39.035036 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.034844 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-tls\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " Apr 23 08:18:39.035036 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.034945 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-grpc-tls\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: 
\"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " Apr 23 08:18:39.035036 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035003 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-rulefiles-0\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " Apr 23 08:18:39.035036 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035032 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-metrics-client-ca\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " Apr 23 08:18:39.035334 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035063 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-tls-assets\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " Apr 23 08:18:39.035334 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035090 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-serving-certs-ca-bundle\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " Apr 23 08:18:39.035334 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035126 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-db\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") " Apr 23 
08:18:39.035334 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035153 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-config\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") "
Apr 23 08:18:39.035334 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035184 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-web-config\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") "
Apr 23 08:18:39.035334 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035206 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") "
Apr 23 08:18:39.035334 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035224 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:39.035334 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035240 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-thanos-prometheus-http-client-file\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") "
Apr 23 08:18:39.035734 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035339 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-trusted-ca-bundle\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") "
Apr 23 08:18:39.035734 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035375 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-config-out\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") "
Apr 23 08:18:39.035734 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035400 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-kube-rbac-proxy\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") "
Apr 23 08:18:39.035734 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035421 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:39.035734 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035439 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f6d02624-cb56-48c2-b2d6-afa71c610d89\" (UID: \"f6d02624-cb56-48c2-b2d6-afa71c610d89\") "
Apr 23 08:18:39.035734 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035727 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.036030 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.035748 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-metrics-client-ca\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.037256 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.036185 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:39.037256 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.036325 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:18:39.037256 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.036618 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:39.037710 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.037676 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:39.037997 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.037929 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:39.038143 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.038112 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:39.038452 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.038416 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-kube-api-access-kxt8r" (OuterVolumeSpecName: "kube-api-access-kxt8r") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "kube-api-access-kxt8r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:18:39.038558 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.038454 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:39.038783 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.038753 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:39.039002 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.038979 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:39.039283 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.039248 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:18:39.039486 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.039467 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-config-out" (OuterVolumeSpecName: "config-out") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:18:39.039548 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.039484 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-config" (OuterVolumeSpecName: "config") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:39.039753 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.039738 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:39.040062 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.040040 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:39.050420 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.050397 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-web-config" (OuterVolumeSpecName: "web-config") pod "f6d02624-cb56-48c2-b2d6-afa71c610d89" (UID: "f6d02624-cb56-48c2-b2d6-afa71c610d89"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:39.136126 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136089 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kxt8r\" (UniqueName: \"kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-kube-api-access-kxt8r\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136126 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136118 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-metrics-client-certs\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136126 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136128 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136139 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-grpc-tls\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136148 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136159 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6d02624-cb56-48c2-b2d6-afa71c610d89-tls-assets\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136169 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136179 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-k8s-db\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136187 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-config\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136196 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-web-config\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136205 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136214 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136223 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6d02624-cb56-48c2-b2d6-afa71c610d89-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136233 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6d02624-cb56-48c2-b2d6-afa71c610d89-config-out\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136241 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-kube-rbac-proxy\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.136379 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.136250 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6d02624-cb56-48c2-b2d6-afa71c610d89-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:18:39.595141 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595104 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerID="4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d" exitCode=0
Apr 23 08:18:39.595141 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595134 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerID="23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541" exitCode=0
Apr 23 08:18:39.595141 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595142 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerID="d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b" exitCode=0
Apr 23 08:18:39.595141 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595150 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerID="3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6" exitCode=0
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595160 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerID="4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d" exitCode=0
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595168 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerID="806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6" exitCode=0
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerDied","Data":"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"}
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerDied","Data":"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"}
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595237 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerDied","Data":"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"}
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595247 2575 scope.go:117] "RemoveContainer" containerID="4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerDied","Data":"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"}
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerDied","Data":"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"}
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595349 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerDied","Data":"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"}
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595363 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6d02624-cb56-48c2-b2d6-afa71c610d89","Type":"ContainerDied","Data":"f4899c3a12a4d9890518988b20930f50b8306cca669ca8900de2b013a4459819"}
Apr 23 08:18:39.595469 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.595236 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:18:39.604971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.604949 2575 scope.go:117] "RemoveContainer" containerID="23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"
Apr 23 08:18:39.613386 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.613367 2575 scope.go:117] "RemoveContainer" containerID="d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"
Apr 23 08:18:39.619420 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.619403 2575 scope.go:117] "RemoveContainer" containerID="3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"
Apr 23 08:18:39.624470 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.624443 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:18:39.626282 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.626260 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:18:39.629446 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.629423 2575 scope.go:117] "RemoveContainer" containerID="4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"
Apr 23 08:18:39.636031 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.636014 2575 scope.go:117] "RemoveContainer" containerID="806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"
Apr 23 08:18:39.642583 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.642567 2575 scope.go:117] "RemoveContainer" containerID="57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"
Apr 23 08:18:39.648644 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.648626 2575 scope.go:117] "RemoveContainer" containerID="4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"
Apr 23 08:18:39.648878 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:18:39.648862 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": container with ID starting with 4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d not found: ID does not exist" containerID="4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"
Apr 23 08:18:39.648920 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.648887 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"} err="failed to get container status \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": rpc error: code = NotFound desc = could not find container \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": container with ID starting with 4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d not found: ID does not exist"
Apr 23 08:18:39.648920 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.648906 2575 scope.go:117] "RemoveContainer" containerID="23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"
Apr 23 08:18:39.649143 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:18:39.649125 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": container with ID starting with 23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541 not found: ID does not exist" containerID="23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"
Apr 23 08:18:39.649182 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.649155 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"} err="failed to get container status \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": rpc error: code = NotFound desc = could not find container \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": container with ID starting with 23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541 not found: ID does not exist"
Apr 23 08:18:39.649182 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.649180 2575 scope.go:117] "RemoveContainer" containerID="d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"
Apr 23 08:18:39.649418 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:18:39.649401 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": container with ID starting with d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b not found: ID does not exist" containerID="d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"
Apr 23 08:18:39.649467 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.649424 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"} err="failed to get container status \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": rpc error: code = NotFound desc = could not find container \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": container with ID starting with d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b not found: ID does not exist"
Apr 23 08:18:39.649467 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.649439 2575 scope.go:117] "RemoveContainer" containerID="3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"
Apr 23 08:18:39.649658 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:18:39.649641 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": container with ID starting with 3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6 not found: ID does not exist" containerID="3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"
Apr 23 08:18:39.649714 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.649666 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"} err="failed to get container status \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": rpc error: code = NotFound desc = could not find container \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": container with ID starting with 3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6 not found: ID does not exist"
Apr 23 08:18:39.649714 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.649686 2575 scope.go:117] "RemoveContainer" containerID="4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"
Apr 23 08:18:39.649922 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:18:39.649905 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": container with ID starting with 4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d not found: ID does not exist" containerID="4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"
Apr 23 08:18:39.649959 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.649926 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"} err="failed to get container status \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": rpc error: code = NotFound desc = could not find container \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": container with ID starting with 4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d not found: ID does not exist"
Apr 23 08:18:39.649959 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.649939 2575 scope.go:117] "RemoveContainer" containerID="806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"
Apr 23 08:18:39.650129 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:18:39.650115 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": container with ID starting with 806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6 not found: ID does not exist" containerID="806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"
Apr 23 08:18:39.650170 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.650133 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"} err="failed to get container status \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": rpc error: code = NotFound desc = could not find container \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": container with ID starting with 806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6 not found: ID does not exist"
Apr 23 08:18:39.650170 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.650146 2575 scope.go:117] "RemoveContainer" containerID="57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"
Apr 23 08:18:39.650431 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:18:39.650414 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": container with ID starting with 57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec not found: ID does not exist" containerID="57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"
Apr 23 08:18:39.650502 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.650437 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"} err="failed to get container status \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": rpc error: code = NotFound desc = could not find container \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": container with ID starting with 57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec not found: ID does not exist"
Apr 23 08:18:39.650502 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.650455 2575 scope.go:117] "RemoveContainer" containerID="4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"
Apr 23 08:18:39.650656 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.650640 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"} err="failed to get container status \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": rpc error: code = NotFound desc = could not find container \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": container with ID starting with 4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d not found: ID does not exist"
Apr 23 08:18:39.650712 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.650658 2575 scope.go:117] "RemoveContainer" containerID="23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"
Apr 23 08:18:39.650852 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.650836 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"} err="failed to get container status \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": rpc error: code = NotFound desc = could not find container \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": container with ID starting with 23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541 not found: ID does not exist"
Apr 23 08:18:39.650913 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.650853 2575 scope.go:117] "RemoveContainer" containerID="d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"
Apr 23 08:18:39.651032 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.651017 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"} err="failed to get container status \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": rpc error: code = NotFound desc = could not find container \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": container with ID starting with d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b not found: ID does not exist"
Apr 23 08:18:39.651092 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.651033 2575 scope.go:117] "RemoveContainer" containerID="3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"
Apr 23 08:18:39.651234 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.651208 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"} err="failed to get container status \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": rpc error: code = NotFound desc = could not find container \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": container with ID starting with 3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6 not found: ID does not exist"
Apr 23 08:18:39.651234 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.651225 2575 scope.go:117] "RemoveContainer" containerID="4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"
Apr 23 08:18:39.651554 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.651476 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"} err="failed to get container status \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": rpc error: code = NotFound desc = could not find container \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": container with ID starting with 4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d not found: ID does not exist"
Apr 23 08:18:39.651554 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.651533 2575 scope.go:117] "RemoveContainer" containerID="806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"
Apr 23 08:18:39.651894 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.651864 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"} err="failed to get container status \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": rpc error: code = NotFound desc = could not find container \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": container with ID starting with 806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6 not found: ID does not exist"
Apr 23 08:18:39.651894 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.651888 2575 scope.go:117] "RemoveContainer" containerID="57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"
Apr 23 08:18:39.652150 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.652124 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"} err="failed to get container status \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": rpc error: code = NotFound desc = could not find container \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": container with ID starting with 57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec not found: ID does not exist"
Apr 23 08:18:39.652272 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.652253 2575 scope.go:117] "RemoveContainer" containerID="4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"
Apr 23 08:18:39.652592 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.652570 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"} err="failed to get container status \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": rpc error: code = NotFound desc = could not find container \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": container with ID starting with 4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d not found: ID does not exist"
Apr 23 08:18:39.652592 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.652592 2575 scope.go:117] "RemoveContainer" containerID="23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"
Apr 23 08:18:39.652975 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.652941 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"} err="failed to get container status \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": rpc error: code = NotFound desc = could not find container \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": container with ID starting with 
23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541 not found: ID does not exist" Apr 23 08:18:39.653046 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.652976 2575 scope.go:117] "RemoveContainer" containerID="d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b" Apr 23 08:18:39.653223 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653202 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"} err="failed to get container status \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": rpc error: code = NotFound desc = could not find container \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": container with ID starting with d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b not found: ID does not exist" Apr 23 08:18:39.653336 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653222 2575 scope.go:117] "RemoveContainer" containerID="3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6" Apr 23 08:18:39.653456 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653441 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:18:39.653568 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653482 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"} err="failed to get container status \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": rpc error: code = NotFound desc = could not find container \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": container with ID starting with 3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6 not found: ID does not exist" Apr 23 08:18:39.653568 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653504 2575 scope.go:117] 
"RemoveContainer" containerID="4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d" Apr 23 08:18:39.653760 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653734 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"} err="failed to get container status \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": rpc error: code = NotFound desc = could not find container \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": container with ID starting with 4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d not found: ID does not exist" Apr 23 08:18:39.653805 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653762 2575 scope.go:117] "RemoveContainer" containerID="806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6" Apr 23 08:18:39.653805 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653768 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy-thanos" Apr 23 08:18:39.653805 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653782 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy-thanos" Apr 23 08:18:39.653805 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653796 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="thanos-sidecar" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653805 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="thanos-sidecar" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653820 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="prometheus" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653828 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="prometheus" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653837 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="config-reloader" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653844 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="config-reloader" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653851 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95bd7a8f-da39-4894-8a21-8bcf1a410653" containerName="console" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653856 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="95bd7a8f-da39-4894-8a21-8bcf1a410653" containerName="console" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653872 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy-web" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653880 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy-web" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653895 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="init-config-reloader" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653903 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="init-config-reloader" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653910 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653917 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy" Apr 23 08:18:39.653971 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653961 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"} err="failed to get container status \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": rpc error: code = NotFound desc = could not find container \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": container with ID starting with 806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6 not found: ID does not exist" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653983 2575 scope.go:117] "RemoveContainer" containerID="57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.653974 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="95bd7a8f-da39-4894-8a21-8bcf1a410653" containerName="console" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.654045 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy-web" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.654061 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="config-reloader" 
Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.654072 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy-thanos" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.654083 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="prometheus" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.654094 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="thanos-sidecar" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.654103 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" containerName="kube-rbac-proxy" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.654370 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"} err="failed to get container status \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": rpc error: code = NotFound desc = could not find container \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": container with ID starting with 57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec not found: ID does not exist" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.654392 2575 scope.go:117] "RemoveContainer" containerID="4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.654773 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"} err="failed to get container status 
\"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": rpc error: code = NotFound desc = could not find container \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": container with ID starting with 4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d not found: ID does not exist" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.654795 2575 scope.go:117] "RemoveContainer" containerID="23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.655026 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"} err="failed to get container status \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": rpc error: code = NotFound desc = could not find container \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": container with ID starting with 23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541 not found: ID does not exist" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.655044 2575 scope.go:117] "RemoveContainer" containerID="d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b" Apr 23 08:18:39.655344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.655263 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"} err="failed to get container status \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": rpc error: code = NotFound desc = could not find container \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": container with ID starting with d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b not found: ID does not exist" Apr 23 08:18:39.655344 ip-10-0-129-53 
kubenswrapper[2575]: I0423 08:18:39.655280 2575 scope.go:117] "RemoveContainer" containerID="3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6" Apr 23 08:18:39.655882 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.655515 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"} err="failed to get container status \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": rpc error: code = NotFound desc = could not find container \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": container with ID starting with 3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6 not found: ID does not exist" Apr 23 08:18:39.655882 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.655538 2575 scope.go:117] "RemoveContainer" containerID="4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d" Apr 23 08:18:39.655882 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.655751 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"} err="failed to get container status \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": rpc error: code = NotFound desc = could not find container \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": container with ID starting with 4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d not found: ID does not exist" Apr 23 08:18:39.655882 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.655767 2575 scope.go:117] "RemoveContainer" containerID="806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6" Apr 23 08:18:39.656007 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.655972 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"} err="failed to get container status \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": rpc error: code = NotFound desc = could not find container \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": container with ID starting with 806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6 not found: ID does not exist" Apr 23 08:18:39.656007 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.655989 2575 scope.go:117] "RemoveContainer" containerID="57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec" Apr 23 08:18:39.656247 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.656221 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"} err="failed to get container status \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": rpc error: code = NotFound desc = could not find container \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": container with ID starting with 57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec not found: ID does not exist" Apr 23 08:18:39.656333 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.656249 2575 scope.go:117] "RemoveContainer" containerID="4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d" Apr 23 08:18:39.656525 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.656502 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"} err="failed to get container status \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": rpc error: code = NotFound desc = could not find container \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": container with ID starting with 
4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d not found: ID does not exist" Apr 23 08:18:39.656601 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.656527 2575 scope.go:117] "RemoveContainer" containerID="23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541" Apr 23 08:18:39.656758 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.656740 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"} err="failed to get container status \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": rpc error: code = NotFound desc = could not find container \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": container with ID starting with 23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541 not found: ID does not exist" Apr 23 08:18:39.656811 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.656759 2575 scope.go:117] "RemoveContainer" containerID="d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b" Apr 23 08:18:39.657000 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.656983 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"} err="failed to get container status \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": rpc error: code = NotFound desc = could not find container \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": container with ID starting with d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b not found: ID does not exist" Apr 23 08:18:39.657047 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.657001 2575 scope.go:117] "RemoveContainer" containerID="3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6" Apr 23 08:18:39.657237 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.657221 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"} err="failed to get container status \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": rpc error: code = NotFound desc = could not find container \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": container with ID starting with 3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6 not found: ID does not exist" Apr 23 08:18:39.657277 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.657238 2575 scope.go:117] "RemoveContainer" containerID="4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d" Apr 23 08:18:39.657451 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.657436 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"} err="failed to get container status \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": rpc error: code = NotFound desc = could not find container \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": container with ID starting with 4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d not found: ID does not exist" Apr 23 08:18:39.657494 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.657453 2575 scope.go:117] "RemoveContainer" containerID="806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6" Apr 23 08:18:39.657672 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.657656 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"} err="failed to get container status \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": rpc error: code = NotFound desc = could not find container 
\"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": container with ID starting with 806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6 not found: ID does not exist" Apr 23 08:18:39.657714 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.657673 2575 scope.go:117] "RemoveContainer" containerID="57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec" Apr 23 08:18:39.657863 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.657844 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"} err="failed to get container status \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": rpc error: code = NotFound desc = could not find container \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": container with ID starting with 57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec not found: ID does not exist" Apr 23 08:18:39.657905 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.657863 2575 scope.go:117] "RemoveContainer" containerID="4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d" Apr 23 08:18:39.658037 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658020 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d"} err="failed to get container status \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": rpc error: code = NotFound desc = could not find container \"4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d\": container with ID starting with 4e92e465c2b5e1217e998bd02b7a0d0a29033e51299a9522a0835144358c0e1d not found: ID does not exist" Apr 23 08:18:39.658037 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658035 2575 scope.go:117] "RemoveContainer" 
containerID="23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541" Apr 23 08:18:39.658203 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658187 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541"} err="failed to get container status \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": rpc error: code = NotFound desc = could not find container \"23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541\": container with ID starting with 23b836dd6abedc386141f48d9b51360cc41af0acf768899a79d17f60aab5e541 not found: ID does not exist" Apr 23 08:18:39.658203 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658200 2575 scope.go:117] "RemoveContainer" containerID="d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b" Apr 23 08:18:39.658426 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658411 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b"} err="failed to get container status \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": rpc error: code = NotFound desc = could not find container \"d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b\": container with ID starting with d7900ce476051c0584f7c83f4faf7a161a0a5941383e00721f8a20d5da50901b not found: ID does not exist" Apr 23 08:18:39.658426 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658426 2575 scope.go:117] "RemoveContainer" containerID="3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6" Apr 23 08:18:39.658620 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658606 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6"} err="failed to get container status 
\"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": rpc error: code = NotFound desc = could not find container \"3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6\": container with ID starting with 3f443626fe05adcb613ff4de02f6f8e08b245952b2cdc9f30e5fb9faeb75c2d6 not found: ID does not exist" Apr 23 08:18:39.658663 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658620 2575 scope.go:117] "RemoveContainer" containerID="4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d" Apr 23 08:18:39.658821 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658803 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d"} err="failed to get container status \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": rpc error: code = NotFound desc = could not find container \"4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d\": container with ID starting with 4c2590956d66c66678db59227845a98711d308edb198d37bd2ced45de838209d not found: ID does not exist" Apr 23 08:18:39.658870 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658821 2575 scope.go:117] "RemoveContainer" containerID="806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6" Apr 23 08:18:39.659001 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.658986 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6"} err="failed to get container status \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": rpc error: code = NotFound desc = could not find container \"806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6\": container with ID starting with 806ed0dd409325c6883781fb6fe8d994743d432edf6b1698a6c25d8a967201b6 not found: ID does not exist" Apr 23 08:18:39.659041 ip-10-0-129-53 
kubenswrapper[2575]: I0423 08:18:39.659002 2575 scope.go:117] "RemoveContainer" containerID="57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec" Apr 23 08:18:39.659165 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.659150 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec"} err="failed to get container status \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": rpc error: code = NotFound desc = could not find container \"57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec\": container with ID starting with 57e01da64cf7e44171b1d49435b27964f076280d3be0d28b99ad2f2c9ada7dec not found: ID does not exist" Apr 23 08:18:39.660090 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.660071 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.662896 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.662876 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 08:18:39.662991 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.662911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-570e3aeg8dk6j\"" Apr 23 08:18:39.662991 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.662970 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 08:18:39.663639 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.663224 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 08:18:39.663639 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.663245 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 08:18:39.663639 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.663254 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 08:18:39.663639 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.663477 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 08:18:39.663639 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.663498 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 08:18:39.663639 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.663529 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 08:18:39.663639 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.663532 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 08:18:39.664058 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.663889 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 08:18:39.664450 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.664434 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n4jfb\"" Apr 23 08:18:39.665802 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.665781 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 08:18:39.669519 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.669500 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 08:18:39.669687 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.669668 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:18:39.740658 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.740625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-config\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.740658 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.740659 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.740897 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.740682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.740897 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.740778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/758f034c-8264-465e-b606-8e656b7d1424-config-out\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.740897 ip-10-0-129-53 
kubenswrapper[2575]: I0423 08:18:39.740826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.740897 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.740855 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.740897 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.740878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741077 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.740917 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741077 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.740933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/758f034c-8264-465e-b606-8e656b7d1424-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741077 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.740960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmqsp\" (UniqueName: \"kubernetes.io/projected/758f034c-8264-465e-b606-8e656b7d1424-kube-api-access-lmqsp\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741077 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.740985 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741077 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.741004 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741077 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.741029 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/758f034c-8264-465e-b606-8e656b7d1424-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741077 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.741052 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741077 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.741068 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741404 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.741087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741404 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.741124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-web-config\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.741404 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.741163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842046 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842046 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842281 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/758f034c-8264-465e-b606-8e656b7d1424-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842281 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842281 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842281 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842281 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-web-config\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842281 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842194 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842281 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-config\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842281 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842629 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842629 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/758f034c-8264-465e-b606-8e656b7d1424-config-out\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842629 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842629 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842629 ip-10-0-129-53 
kubenswrapper[2575]: I0423 08:18:39.842393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842629 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842629 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/758f034c-8264-465e-b606-8e656b7d1424-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.842629 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.842481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmqsp\" (UniqueName: \"kubernetes.io/projected/758f034c-8264-465e-b606-8e656b7d1424-kube-api-access-lmqsp\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.843373 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.843324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.845527 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.845459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.845527 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.845476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/758f034c-8264-465e-b606-8e656b7d1424-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.845527 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.845504 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.845730 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.845587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.845730 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.845620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.845923 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.845899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/758f034c-8264-465e-b606-8e656b7d1424-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.846151 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.846124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.846457 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.846428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.846457 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.846439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.846603 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.846529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.847078 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.847051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/758f034c-8264-465e-b606-8e656b7d1424-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.847171 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.847140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.848020 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.847992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.848197 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.848175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-config\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.848413 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.848396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/758f034c-8264-465e-b606-8e656b7d1424-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.848522 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.848459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/758f034c-8264-465e-b606-8e656b7d1424-config-out\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.850650 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.850629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmqsp\" (UniqueName: \"kubernetes.io/projected/758f034c-8264-465e-b606-8e656b7d1424-kube-api-access-lmqsp\") pod \"prometheus-k8s-0\" (UID: \"758f034c-8264-465e-b606-8e656b7d1424\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:39.970973 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:39.970930 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:18:40.100565 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:40.100466 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:18:40.104680 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:18:40.104653 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod758f034c_8264_465e_b606_8e656b7d1424.slice/crio-66d886a783e580fb1f261c7b7b7006d3e49ad73186720d09b5db50ed8678c56c WatchSource:0}: Error finding container 66d886a783e580fb1f261c7b7b7006d3e49ad73186720d09b5db50ed8678c56c: Status 404 returned error can't find the container with id 66d886a783e580fb1f261c7b7b7006d3e49ad73186720d09b5db50ed8678c56c Apr 23 08:18:40.599406 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:40.599369 2575 generic.go:358] "Generic (PLEG): container finished" podID="758f034c-8264-465e-b606-8e656b7d1424" containerID="68826dbc8fa6e8f2c1090c633815eae9ec9d229b831bba8253ae4ddefc2c4cac" exitCode=0 Apr 23 08:18:40.599595 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:40.599469 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"758f034c-8264-465e-b606-8e656b7d1424","Type":"ContainerDied","Data":"68826dbc8fa6e8f2c1090c633815eae9ec9d229b831bba8253ae4ddefc2c4cac"} Apr 23 08:18:40.599595 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:40.599508 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"758f034c-8264-465e-b606-8e656b7d1424","Type":"ContainerStarted","Data":"66d886a783e580fb1f261c7b7b7006d3e49ad73186720d09b5db50ed8678c56c"} Apr 23 08:18:40.778385 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:40.778355 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d02624-cb56-48c2-b2d6-afa71c610d89" 
path="/var/lib/kubelet/pods/f6d02624-cb56-48c2-b2d6-afa71c610d89/volumes" Apr 23 08:18:41.606468 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:41.606431 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"758f034c-8264-465e-b606-8e656b7d1424","Type":"ContainerStarted","Data":"137f539e93cab44ab7b9511ead8b80359b6dafdf109e8dd27371f185deba68d3"} Apr 23 08:18:41.606468 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:41.606470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"758f034c-8264-465e-b606-8e656b7d1424","Type":"ContainerStarted","Data":"62022a76fd8e41204bc1027f3c9e0de1bcc4afb16d163430d134376950633f8c"} Apr 23 08:18:41.606668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:41.606480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"758f034c-8264-465e-b606-8e656b7d1424","Type":"ContainerStarted","Data":"18006f941ed4b4ee7beba48b0d232055d1935b0ca966f6fd2caa11c57b20e195"} Apr 23 08:18:41.606668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:41.606490 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"758f034c-8264-465e-b606-8e656b7d1424","Type":"ContainerStarted","Data":"78276a659fda250229f9b9684938ed6c39c3afba76c3e0a8d91bda106887a54a"} Apr 23 08:18:41.606668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:41.606498 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"758f034c-8264-465e-b606-8e656b7d1424","Type":"ContainerStarted","Data":"959a36a0c014bd4b21a61c133e189be277ee11db218d85d23437d8c1a407457d"} Apr 23 08:18:41.606668 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:41.606506 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"758f034c-8264-465e-b606-8e656b7d1424","Type":"ContainerStarted","Data":"815e73a47eada54da0e57a11e6c479409625b7cd98518fe0e2e49c56035b59b2"} Apr 23 08:18:41.634919 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:41.634860 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.634842146 podStartE2EDuration="2.634842146s" podCreationTimestamp="2026-04-23 08:18:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:18:41.63301089 +0000 UTC m=+259.431401357" watchObservedRunningTime="2026-04-23 08:18:41.634842146 +0000 UTC m=+259.433232617" Apr 23 08:18:44.971210 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:18:44.971177 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:19:22.668187 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:19:22.668161 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 08:19:39.971980 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:19:39.971936 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:19:39.986603 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:19:39.986578 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:19:40.795900 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:19:40.795866 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:21:17.043184 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.043151 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"] Apr 23 08:21:17.046164 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.046145 2575 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.048904 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.048882 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 23 08:21:17.049901 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.049881 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 23 08:21:17.049901 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.049885 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\""
Apr 23 08:21:17.050049 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.049908 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\""
Apr 23 08:21:17.050049 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.049887 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-m6wzr\""
Apr 23 08:21:17.058394 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.058376 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"]
Apr 23 08:21:17.179455 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.179415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a226d354-93fe-46d5-9753-49d3657f216c-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-rh4m6\" (UID: \"a226d354-93fe-46d5-9753-49d3657f216c\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.179612 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.179461 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/a226d354-93fe-46d5-9753-49d3657f216c-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-rh4m6\" (UID: \"a226d354-93fe-46d5-9753-49d3657f216c\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.179612 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.179495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljmjw\" (UniqueName: \"kubernetes.io/projected/a226d354-93fe-46d5-9753-49d3657f216c-kube-api-access-ljmjw\") pod \"kubeflow-trainer-controller-manager-55f5694779-rh4m6\" (UID: \"a226d354-93fe-46d5-9753-49d3657f216c\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.280171 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.280141 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/a226d354-93fe-46d5-9753-49d3657f216c-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-rh4m6\" (UID: \"a226d354-93fe-46d5-9753-49d3657f216c\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.280363 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.280185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljmjw\" (UniqueName: \"kubernetes.io/projected/a226d354-93fe-46d5-9753-49d3657f216c-kube-api-access-ljmjw\") pod \"kubeflow-trainer-controller-manager-55f5694779-rh4m6\" (UID: \"a226d354-93fe-46d5-9753-49d3657f216c\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.280363 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.280277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a226d354-93fe-46d5-9753-49d3657f216c-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-rh4m6\" (UID: \"a226d354-93fe-46d5-9753-49d3657f216c\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.280797 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.280765 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/a226d354-93fe-46d5-9753-49d3657f216c-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-rh4m6\" (UID: \"a226d354-93fe-46d5-9753-49d3657f216c\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.282571 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.282550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a226d354-93fe-46d5-9753-49d3657f216c-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-rh4m6\" (UID: \"a226d354-93fe-46d5-9753-49d3657f216c\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.288637 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.288616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljmjw\" (UniqueName: \"kubernetes.io/projected/a226d354-93fe-46d5-9753-49d3657f216c-kube-api-access-ljmjw\") pod \"kubeflow-trainer-controller-manager-55f5694779-rh4m6\" (UID: \"a226d354-93fe-46d5-9753-49d3657f216c\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.380191 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.380144 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:17.706855 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.706187 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"]
Apr 23 08:21:17.712285 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:21:17.712253 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda226d354_93fe_46d5_9753_49d3657f216c.slice/crio-c869603af7635151f82aa7a82a828b21a44e1d53d4fd46d6698b2a5b92aefa6f WatchSource:0}: Error finding container c869603af7635151f82aa7a82a828b21a44e1d53d4fd46d6698b2a5b92aefa6f: Status 404 returned error can't find the container with id c869603af7635151f82aa7a82a828b21a44e1d53d4fd46d6698b2a5b92aefa6f
Apr 23 08:21:17.714095 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:17.714076 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:21:18.045413 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:18.045333 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6" event={"ID":"a226d354-93fe-46d5-9753-49d3657f216c","Type":"ContainerStarted","Data":"c869603af7635151f82aa7a82a828b21a44e1d53d4fd46d6698b2a5b92aefa6f"}
Apr 23 08:21:21.054138 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:21.054097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6" event={"ID":"a226d354-93fe-46d5-9753-49d3657f216c","Type":"ContainerStarted","Data":"41d471a566bc12182a0d1aac474070cc32843fbfd55125ac6e587bacb04faa89"}
Apr 23 08:21:21.054567 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:21.054233 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:21:21.070824 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:21.070774 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6" podStartSLOduration=1.763442649 podStartE2EDuration="4.070759797s" podCreationTimestamp="2026-04-23 08:21:17 +0000 UTC" firstStartedPulling="2026-04-23 08:21:17.714236085 +0000 UTC m=+415.512626532" lastFinishedPulling="2026-04-23 08:21:20.021553231 +0000 UTC m=+417.819943680" observedRunningTime="2026-04-23 08:21:21.069053992 +0000 UTC m=+418.867444458" watchObservedRunningTime="2026-04-23 08:21:21.070759797 +0000 UTC m=+418.869150264"
Apr 23 08:21:37.062220 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:21:37.062192 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-rh4m6"
Apr 23 08:23:09.030057 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.030022 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"]
Apr 23 08:23:09.033564 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.033545 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"
Apr 23 08:23:09.035967 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.035936 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-kzjxw\"/\"openshift-service-ca.crt\""
Apr 23 08:23:09.036120 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.036100 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-kzjxw\"/\"kube-root-ca.crt\""
Apr 23 08:23:09.037090 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.037076 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-kzjxw\"/\"default-dockercfg-w8z9z\""
Apr 23 08:23:09.040063 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.040045 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"]
Apr 23 08:23:09.193619 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.193587 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt48q\" (UniqueName: \"kubernetes.io/projected/67b2eb03-50d8-44a9-a30e-a5d4f08d5d96-kube-api-access-gt48q\") pod \"test-trainjob-cvqh5-node-0-0-f7qpz\" (UID: \"67b2eb03-50d8-44a9-a30e-a5d4f08d5d96\") " pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"
Apr 23 08:23:09.294372 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.294260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gt48q\" (UniqueName: \"kubernetes.io/projected/67b2eb03-50d8-44a9-a30e-a5d4f08d5d96-kube-api-access-gt48q\") pod \"test-trainjob-cvqh5-node-0-0-f7qpz\" (UID: \"67b2eb03-50d8-44a9-a30e-a5d4f08d5d96\") " pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"
Apr 23 08:23:09.303246 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.303216 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt48q\" (UniqueName: \"kubernetes.io/projected/67b2eb03-50d8-44a9-a30e-a5d4f08d5d96-kube-api-access-gt48q\") pod \"test-trainjob-cvqh5-node-0-0-f7qpz\" (UID: \"67b2eb03-50d8-44a9-a30e-a5d4f08d5d96\") " pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"
Apr 23 08:23:09.343988 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.343963 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"
Apr 23 08:23:09.462468 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:09.462443 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"]
Apr 23 08:23:09.464436 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:23:09.464403 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67b2eb03_50d8_44a9_a30e_a5d4f08d5d96.slice/crio-c3877c93efaf4c73af696ecc0ce21484ec5d82a799a8f7dadebf3a4917b2f4aa WatchSource:0}: Error finding container c3877c93efaf4c73af696ecc0ce21484ec5d82a799a8f7dadebf3a4917b2f4aa: Status 404 returned error can't find the container with id c3877c93efaf4c73af696ecc0ce21484ec5d82a799a8f7dadebf3a4917b2f4aa
Apr 23 08:23:10.366607 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:23:10.366567 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz" event={"ID":"67b2eb03-50d8-44a9-a30e-a5d4f08d5d96","Type":"ContainerStarted","Data":"c3877c93efaf4c73af696ecc0ce21484ec5d82a799a8f7dadebf3a4917b2f4aa"}
Apr 23 08:28:12.290697 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:12.290660 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz" event={"ID":"67b2eb03-50d8-44a9-a30e-a5d4f08d5d96","Type":"ContainerStarted","Data":"a727bdf8af75129436a726230c7b7cb04bf2934a9e8dda55a2d17733081c0db3"}
Apr 23 08:28:12.315280 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:12.315225 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz" podStartSLOduration=1.349617647 podStartE2EDuration="5m3.315212453s" podCreationTimestamp="2026-04-23 08:23:09 +0000 UTC" firstStartedPulling="2026-04-23 08:23:09.466161262 +0000 UTC m=+527.264551707" lastFinishedPulling="2026-04-23 08:28:11.431756068 +0000 UTC m=+829.230146513" observedRunningTime="2026-04-23 08:28:12.31335284 +0000 UTC m=+830.111743306" watchObservedRunningTime="2026-04-23 08:28:12.315212453 +0000 UTC m=+830.113602920"
Apr 23 08:28:17.306567 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:17.306488 2575 generic.go:358] "Generic (PLEG): container finished" podID="67b2eb03-50d8-44a9-a30e-a5d4f08d5d96" containerID="a727bdf8af75129436a726230c7b7cb04bf2934a9e8dda55a2d17733081c0db3" exitCode=0
Apr 23 08:28:17.306567 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:17.306529 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz" event={"ID":"67b2eb03-50d8-44a9-a30e-a5d4f08d5d96","Type":"ContainerDied","Data":"a727bdf8af75129436a726230c7b7cb04bf2934a9e8dda55a2d17733081c0db3"}
Apr 23 08:28:18.657575 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:18.657551 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"
Apr 23 08:28:18.791582 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:18.791551 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt48q\" (UniqueName: \"kubernetes.io/projected/67b2eb03-50d8-44a9-a30e-a5d4f08d5d96-kube-api-access-gt48q\") pod \"67b2eb03-50d8-44a9-a30e-a5d4f08d5d96\" (UID: \"67b2eb03-50d8-44a9-a30e-a5d4f08d5d96\") "
Apr 23 08:28:18.793641 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:18.793613 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b2eb03-50d8-44a9-a30e-a5d4f08d5d96-kube-api-access-gt48q" (OuterVolumeSpecName: "kube-api-access-gt48q") pod "67b2eb03-50d8-44a9-a30e-a5d4f08d5d96" (UID: "67b2eb03-50d8-44a9-a30e-a5d4f08d5d96"). InnerVolumeSpecName "kube-api-access-gt48q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:28:18.892877 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:18.892850 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gt48q\" (UniqueName: \"kubernetes.io/projected/67b2eb03-50d8-44a9-a30e-a5d4f08d5d96-kube-api-access-gt48q\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:28:19.313034 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.312954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz" event={"ID":"67b2eb03-50d8-44a9-a30e-a5d4f08d5d96","Type":"ContainerDied","Data":"c3877c93efaf4c73af696ecc0ce21484ec5d82a799a8f7dadebf3a4917b2f4aa"}
Apr 23 08:28:19.313034 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.312984 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3877c93efaf4c73af696ecc0ce21484ec5d82a799a8f7dadebf3a4917b2f4aa"
Apr 23 08:28:19.313034 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.312984 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"
Apr 23 08:28:19.619623 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.619592 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"]
Apr 23 08:28:19.619922 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.619910 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67b2eb03-50d8-44a9-a30e-a5d4f08d5d96" containerName="node"
Apr 23 08:28:19.619966 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.619924 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b2eb03-50d8-44a9-a30e-a5d4f08d5d96" containerName="node"
Apr 23 08:28:19.620003 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.619980 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="67b2eb03-50d8-44a9-a30e-a5d4f08d5d96" containerName="node"
Apr 23 08:28:19.686623 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.686593 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"]
Apr 23 08:28:19.686623 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.686608 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"
Apr 23 08:28:19.690207 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.690186 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-nfslg\"/\"openshift-service-ca.crt\""
Apr 23 08:28:19.690207 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.690201 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-nfslg\"/\"kube-root-ca.crt\""
Apr 23 08:28:19.691285 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.691269 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-nfslg\"/\"default-dockercfg-m8zjm\""
Apr 23 08:28:19.800133 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.800101 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gfx8\" (UniqueName: \"kubernetes.io/projected/572438f0-32c1-4ad1-8d04-867320107fdf-kube-api-access-7gfx8\") pod \"test-trainjob-zxl4c-node-0-0-8z2gw\" (UID: \"572438f0-32c1-4ad1-8d04-867320107fdf\") " pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"
Apr 23 08:28:19.901510 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.901445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gfx8\" (UniqueName: \"kubernetes.io/projected/572438f0-32c1-4ad1-8d04-867320107fdf-kube-api-access-7gfx8\") pod \"test-trainjob-zxl4c-node-0-0-8z2gw\" (UID: \"572438f0-32c1-4ad1-8d04-867320107fdf\") " pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"
Apr 23 08:28:19.910135 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.910110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gfx8\" (UniqueName: \"kubernetes.io/projected/572438f0-32c1-4ad1-8d04-867320107fdf-kube-api-access-7gfx8\") pod \"test-trainjob-zxl4c-node-0-0-8z2gw\" (UID: \"572438f0-32c1-4ad1-8d04-867320107fdf\") " pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"
Apr 23 08:28:19.995534 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:19.995501 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"
Apr 23 08:28:20.199482 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:20.199457 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"]
Apr 23 08:28:20.202147 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:28:20.202120 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod572438f0_32c1_4ad1_8d04_867320107fdf.slice/crio-e531b007ba5cda385bbb0ec427796de394b7b7269c966cb0fee17a7666b574c7 WatchSource:0}: Error finding container e531b007ba5cda385bbb0ec427796de394b7b7269c966cb0fee17a7666b574c7: Status 404 returned error can't find the container with id e531b007ba5cda385bbb0ec427796de394b7b7269c966cb0fee17a7666b574c7
Apr 23 08:28:20.204081 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:20.204064 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:28:20.316602 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:28:20.316573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw" event={"ID":"572438f0-32c1-4ad1-8d04-867320107fdf","Type":"ContainerStarted","Data":"e531b007ba5cda385bbb0ec427796de394b7b7269c966cb0fee17a7666b574c7"}
Apr 23 08:33:02.172092 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:02.172056 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw" event={"ID":"572438f0-32c1-4ad1-8d04-867320107fdf","Type":"ContainerStarted","Data":"096c7aa34be26732e388e04b3412de984f125158015c3467b93f90ac9339168d"}
Apr 23 08:33:02.200773 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:02.200725 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw" podStartSLOduration=2.008678985 podStartE2EDuration="4m43.200705753s" podCreationTimestamp="2026-04-23 08:28:19 +0000 UTC" firstStartedPulling="2026-04-23 08:28:20.20418759 +0000 UTC m=+838.002578036" lastFinishedPulling="2026-04-23 08:33:01.396214344 +0000 UTC m=+1119.194604804" observedRunningTime="2026-04-23 08:33:02.197986524 +0000 UTC m=+1119.996376992" watchObservedRunningTime="2026-04-23 08:33:02.200705753 +0000 UTC m=+1119.999096219"
Apr 23 08:33:09.193648 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:09.193613 2575 generic.go:358] "Generic (PLEG): container finished" podID="572438f0-32c1-4ad1-8d04-867320107fdf" containerID="096c7aa34be26732e388e04b3412de984f125158015c3467b93f90ac9339168d" exitCode=0
Apr 23 08:33:09.194080 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:09.193675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw" event={"ID":"572438f0-32c1-4ad1-8d04-867320107fdf","Type":"ContainerDied","Data":"096c7aa34be26732e388e04b3412de984f125158015c3467b93f90ac9339168d"}
Apr 23 08:33:10.324087 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:10.324067 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"
Apr 23 08:33:10.422560 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:10.422531 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gfx8\" (UniqueName: \"kubernetes.io/projected/572438f0-32c1-4ad1-8d04-867320107fdf-kube-api-access-7gfx8\") pod \"572438f0-32c1-4ad1-8d04-867320107fdf\" (UID: \"572438f0-32c1-4ad1-8d04-867320107fdf\") "
Apr 23 08:33:10.424622 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:10.424596 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/572438f0-32c1-4ad1-8d04-867320107fdf-kube-api-access-7gfx8" (OuterVolumeSpecName: "kube-api-access-7gfx8") pod "572438f0-32c1-4ad1-8d04-867320107fdf" (UID: "572438f0-32c1-4ad1-8d04-867320107fdf"). InnerVolumeSpecName "kube-api-access-7gfx8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:33:10.523755 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:10.523690 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7gfx8\" (UniqueName: \"kubernetes.io/projected/572438f0-32c1-4ad1-8d04-867320107fdf-kube-api-access-7gfx8\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:33:11.201067 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:11.201037 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw" event={"ID":"572438f0-32c1-4ad1-8d04-867320107fdf","Type":"ContainerDied","Data":"e531b007ba5cda385bbb0ec427796de394b7b7269c966cb0fee17a7666b574c7"}
Apr 23 08:33:11.201067 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:11.201070 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e531b007ba5cda385bbb0ec427796de394b7b7269c966cb0fee17a7666b574c7"
Apr 23 08:33:11.201255 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:11.201076 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"
Apr 23 08:33:12.123423 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.123389 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"]
Apr 23 08:33:12.123801 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.123692 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="572438f0-32c1-4ad1-8d04-867320107fdf" containerName="node"
Apr 23 08:33:12.123801 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.123704 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="572438f0-32c1-4ad1-8d04-867320107fdf" containerName="node"
Apr 23 08:33:12.123801 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.123769 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="572438f0-32c1-4ad1-8d04-867320107fdf" containerName="node"
Apr 23 08:33:12.280285 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.280252 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"]
Apr 23 08:33:12.280458 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.280332 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"
Apr 23 08:33:12.283422 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.283400 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-j9crz\"/\"openshift-service-ca.crt\""
Apr 23 08:33:12.283559 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.283437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-j9crz\"/\"default-dockercfg-65lnn\""
Apr 23 08:33:12.283559 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.283405 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-j9crz\"/\"kube-root-ca.crt\""
Apr 23 08:33:12.339823 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.339793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp8k9\" (UniqueName: \"kubernetes.io/projected/9413768a-0739-4b86-b01e-9e22013bf338-kube-api-access-lp8k9\") pod \"test-trainjob-zw29c-node-0-0-f5dtj\" (UID: \"9413768a-0739-4b86-b01e-9e22013bf338\") " pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"
Apr 23 08:33:12.441352 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.441260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lp8k9\" (UniqueName: \"kubernetes.io/projected/9413768a-0739-4b86-b01e-9e22013bf338-kube-api-access-lp8k9\") pod \"test-trainjob-zw29c-node-0-0-f5dtj\" (UID: \"9413768a-0739-4b86-b01e-9e22013bf338\") " pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"
Apr 23 08:33:12.450518 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.450496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp8k9\" (UniqueName: \"kubernetes.io/projected/9413768a-0739-4b86-b01e-9e22013bf338-kube-api-access-lp8k9\") pod \"test-trainjob-zw29c-node-0-0-f5dtj\" (UID: \"9413768a-0739-4b86-b01e-9e22013bf338\") " pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"
Apr 23 08:33:12.589769 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.589736 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"
Apr 23 08:33:12.708665 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:12.708601 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"]
Apr 23 08:33:12.710986 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:33:12.710961 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9413768a_0739_4b86_b01e_9e22013bf338.slice/crio-2b8f8ef75cddf689b71548bd6354cedaacc4e04e51e92712773a3aa9888e9d14 WatchSource:0}: Error finding container 2b8f8ef75cddf689b71548bd6354cedaacc4e04e51e92712773a3aa9888e9d14: Status 404 returned error can't find the container with id 2b8f8ef75cddf689b71548bd6354cedaacc4e04e51e92712773a3aa9888e9d14
Apr 23 08:33:13.208258 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:33:13.208223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj" event={"ID":"9413768a-0739-4b86-b01e-9e22013bf338","Type":"ContainerStarted","Data":"2b8f8ef75cddf689b71548bd6354cedaacc4e04e51e92712773a3aa9888e9d14"}
Apr 23 08:34:33.477235 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:33.477133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj" event={"ID":"9413768a-0739-4b86-b01e-9e22013bf338","Type":"ContainerStarted","Data":"9c583e7dbbba06df8f4b52d965ab74743e5e4b1aec6a960639e16763453df39c"}
Apr 23 08:34:33.497648 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:33.497590 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj" podStartSLOduration=1.091699626 podStartE2EDuration="1m21.497571582s" podCreationTimestamp="2026-04-23 08:33:12 +0000 UTC" firstStartedPulling="2026-04-23 08:33:12.71344387 +0000 UTC m=+1130.511834315" lastFinishedPulling="2026-04-23 08:34:33.119315825 +0000 UTC m=+1210.917706271" observedRunningTime="2026-04-23 08:34:33.496972916 +0000 UTC m=+1211.295363385" watchObservedRunningTime="2026-04-23 08:34:33.497571582 +0000 UTC m=+1211.295962050"
Apr 23 08:34:36.487730 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:36.487692 2575 generic.go:358] "Generic (PLEG): container finished" podID="9413768a-0739-4b86-b01e-9e22013bf338" containerID="9c583e7dbbba06df8f4b52d965ab74743e5e4b1aec6a960639e16763453df39c" exitCode=0
Apr 23 08:34:36.488197 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:36.487772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj" event={"ID":"9413768a-0739-4b86-b01e-9e22013bf338","Type":"ContainerDied","Data":"9c583e7dbbba06df8f4b52d965ab74743e5e4b1aec6a960639e16763453df39c"}
Apr 23 08:34:37.614258 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:37.614235 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"
Apr 23 08:34:37.688598 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:37.688557 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp8k9\" (UniqueName: \"kubernetes.io/projected/9413768a-0739-4b86-b01e-9e22013bf338-kube-api-access-lp8k9\") pod \"9413768a-0739-4b86-b01e-9e22013bf338\" (UID: \"9413768a-0739-4b86-b01e-9e22013bf338\") "
Apr 23 08:34:37.690661 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:37.690633 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9413768a-0739-4b86-b01e-9e22013bf338-kube-api-access-lp8k9" (OuterVolumeSpecName: "kube-api-access-lp8k9") pod "9413768a-0739-4b86-b01e-9e22013bf338" (UID: "9413768a-0739-4b86-b01e-9e22013bf338"). InnerVolumeSpecName "kube-api-access-lp8k9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:34:37.790123 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:37.790031 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lp8k9\" (UniqueName: \"kubernetes.io/projected/9413768a-0739-4b86-b01e-9e22013bf338-kube-api-access-lp8k9\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\""
Apr 23 08:34:38.494513 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:38.494483 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"
Apr 23 08:34:38.494513 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:38.494491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj" event={"ID":"9413768a-0739-4b86-b01e-9e22013bf338","Type":"ContainerDied","Data":"2b8f8ef75cddf689b71548bd6354cedaacc4e04e51e92712773a3aa9888e9d14"}
Apr 23 08:34:38.494513 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:38.494520 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b8f8ef75cddf689b71548bd6354cedaacc4e04e51e92712773a3aa9888e9d14"
Apr 23 08:34:39.450219 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.450181 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"]
Apr 23 08:34:39.450691 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.450537 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9413768a-0739-4b86-b01e-9e22013bf338" containerName="node"
Apr 23 08:34:39.450691 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.450550 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9413768a-0739-4b86-b01e-9e22013bf338" containerName="node"
Apr 23 08:34:39.450691 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.450617 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9413768a-0739-4b86-b01e-9e22013bf338" containerName="node"
Apr 23 08:34:39.524673 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.524639 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"]
Apr 23 08:34:39.524832 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.524755 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"
Apr 23 08:34:39.527574 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.527551 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-ccmmn\"/\"kube-root-ca.crt\""
Apr 23 08:34:39.528632 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.528613 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-ccmmn\"/\"default-dockercfg-mzt67\""
Apr 23 08:34:39.528733 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.528613 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-ccmmn\"/\"openshift-service-ca.crt\""
Apr 23 08:34:39.607051 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.607016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfxlq\" (UniqueName: \"kubernetes.io/projected/48eed880-787e-4f3d-b86d-6f0898e86459-kube-api-access-tfxlq\") pod \"test-trainjob-f822p-node-0-0-b4vcs\" (UID: \"48eed880-787e-4f3d-b86d-6f0898e86459\") " pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"
Apr 23 08:34:39.707723 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.707645 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfxlq\" (UniqueName: \"kubernetes.io/projected/48eed880-787e-4f3d-b86d-6f0898e86459-kube-api-access-tfxlq\") pod \"test-trainjob-f822p-node-0-0-b4vcs\" (UID: \"48eed880-787e-4f3d-b86d-6f0898e86459\") " pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"
Apr 23 08:34:39.716996 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.716968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfxlq\" (UniqueName: \"kubernetes.io/projected/48eed880-787e-4f3d-b86d-6f0898e86459-kube-api-access-tfxlq\") pod \"test-trainjob-f822p-node-0-0-b4vcs\" (UID: \"48eed880-787e-4f3d-b86d-6f0898e86459\") " pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"
Apr 23 08:34:39.833344 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.833284 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"
Apr 23 08:34:39.954802 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.954779 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"]
Apr 23 08:34:39.957085 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:34:39.957052 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48eed880_787e_4f3d_b86d_6f0898e86459.slice/crio-3b79c27fb4051ec4be3e385d9087d0706e5dee968c0e7e1caf77501fcc22590c WatchSource:0}: Error finding container 3b79c27fb4051ec4be3e385d9087d0706e5dee968c0e7e1caf77501fcc22590c: Status 404 returned error can't find the container with id 3b79c27fb4051ec4be3e385d9087d0706e5dee968c0e7e1caf77501fcc22590c
Apr 23 08:34:39.959101 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:39.959085 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:34:40.501534 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:34:40.501500 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs" event={"ID":"48eed880-787e-4f3d-b86d-6f0898e86459","Type":"ContainerStarted","Data":"3b79c27fb4051ec4be3e385d9087d0706e5dee968c0e7e1caf77501fcc22590c"}
Apr 23 08:42:02.997215 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:02.997181 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs" event={"ID":"48eed880-787e-4f3d-b86d-6f0898e86459","Type":"ContainerStarted","Data":"53654a4173d960af6bdd7a4db3d5e225043115b8d519859325eadb19b4ac824a"}
Apr 23 08:42:03.000066 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:03.000049 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-ccmmn\"/\"default-dockercfg-mzt67\""
Apr 23 08:42:03.026189 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:03.026139 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs" podStartSLOduration=1.462176247 podStartE2EDuration="7m24.026123554s" podCreationTimestamp="2026-04-23 08:34:39 +0000 UTC" firstStartedPulling="2026-04-23 08:34:39.959206747 +0000 UTC m=+1217.757597192" lastFinishedPulling="2026-04-23 08:42:02.523154041 +0000 UTC m=+1660.321544499" observedRunningTime="2026-04-23 08:42:03.024811523 +0000 UTC m=+1660.823201991" watchObservedRunningTime="2026-04-23 08:42:03.026123554 +0000 UTC m=+1660.824514032"
Apr 23 08:42:03.104498 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:03.104467 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-ccmmn\"/\"kube-root-ca.crt\""
Apr 23 08:42:03.114145 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:03.114129 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-ccmmn\"/\"openshift-service-ca.crt\""
Apr 23 08:42:07.016974 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:07.016893 2575 generic.go:358] "Generic (PLEG): container finished" podID="48eed880-787e-4f3d-b86d-6f0898e86459" containerID="53654a4173d960af6bdd7a4db3d5e225043115b8d519859325eadb19b4ac824a" exitCode=0
Apr 23 08:42:07.017326 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:07.016974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs" event={"ID":"48eed880-787e-4f3d-b86d-6f0898e86459","Type":"ContainerDied","Data":"53654a4173d960af6bdd7a4db3d5e225043115b8d519859325eadb19b4ac824a"}
Apr 23 08:42:08.345245 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:08.345222 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"
Apr 23 08:42:08.445837 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:08.445807 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfxlq\" (UniqueName: \"kubernetes.io/projected/48eed880-787e-4f3d-b86d-6f0898e86459-kube-api-access-tfxlq\") pod \"48eed880-787e-4f3d-b86d-6f0898e86459\" (UID: \"48eed880-787e-4f3d-b86d-6f0898e86459\") "
Apr 23 08:42:08.448174 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:08.448146 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48eed880-787e-4f3d-b86d-6f0898e86459-kube-api-access-tfxlq" (OuterVolumeSpecName: "kube-api-access-tfxlq") pod "48eed880-787e-4f3d-b86d-6f0898e86459" (UID: "48eed880-787e-4f3d-b86d-6f0898e86459"). InnerVolumeSpecName "kube-api-access-tfxlq".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:42:08.547453 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:08.547391 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tfxlq\" (UniqueName: \"kubernetes.io/projected/48eed880-787e-4f3d-b86d-6f0898e86459-kube-api-access-tfxlq\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\"" Apr 23 08:42:09.024725 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.024692 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs" event={"ID":"48eed880-787e-4f3d-b86d-6f0898e86459","Type":"ContainerDied","Data":"3b79c27fb4051ec4be3e385d9087d0706e5dee968c0e7e1caf77501fcc22590c"} Apr 23 08:42:09.024725 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.024723 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b79c27fb4051ec4be3e385d9087d0706e5dee968c0e7e1caf77501fcc22590c" Apr 23 08:42:09.024725 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.024728 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs" Apr 23 08:42:09.905262 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.905230 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw"] Apr 23 08:42:09.905645 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.905543 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48eed880-787e-4f3d-b86d-6f0898e86459" containerName="node" Apr 23 08:42:09.905645 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.905554 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="48eed880-787e-4f3d-b86d-6f0898e86459" containerName="node" Apr 23 08:42:09.905645 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.905610 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="48eed880-787e-4f3d-b86d-6f0898e86459" containerName="node" Apr 23 08:42:09.928604 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.928573 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw"] Apr 23 08:42:09.928759 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.928696 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" Apr 23 08:42:09.932138 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.932032 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-w4p9q\"/\"default-dockercfg-s7sp6\"" Apr 23 08:42:09.932138 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.932098 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-w4p9q\"/\"openshift-service-ca.crt\"" Apr 23 08:42:09.932364 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:09.932035 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-w4p9q\"/\"kube-root-ca.crt\"" Apr 23 08:42:10.062051 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:10.062014 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khz7w\" (UniqueName: \"kubernetes.io/projected/98b0c410-29f4-42b1-be00-8d89e3efad9b-kube-api-access-khz7w\") pod \"test-trainjob-xl5kw-node-0-0-grfdw\" (UID: \"98b0c410-29f4-42b1-be00-8d89e3efad9b\") " pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" Apr 23 08:42:10.163039 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:10.162961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khz7w\" (UniqueName: \"kubernetes.io/projected/98b0c410-29f4-42b1-be00-8d89e3efad9b-kube-api-access-khz7w\") pod \"test-trainjob-xl5kw-node-0-0-grfdw\" (UID: \"98b0c410-29f4-42b1-be00-8d89e3efad9b\") " pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" Apr 23 08:42:10.171979 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:10.171951 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khz7w\" (UniqueName: \"kubernetes.io/projected/98b0c410-29f4-42b1-be00-8d89e3efad9b-kube-api-access-khz7w\") pod \"test-trainjob-xl5kw-node-0-0-grfdw\" (UID: \"98b0c410-29f4-42b1-be00-8d89e3efad9b\") " 
pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" Apr 23 08:42:10.239348 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:10.239310 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" Apr 23 08:42:10.371022 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:10.370993 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw"] Apr 23 08:42:10.374695 ip-10-0-129-53 kubenswrapper[2575]: W0423 08:42:10.374664 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98b0c410_29f4_42b1_be00_8d89e3efad9b.slice/crio-937e09db46e0a6b9d6ad5b856bbf91ad59391acc22c0cde64924680410d7e1c4 WatchSource:0}: Error finding container 937e09db46e0a6b9d6ad5b856bbf91ad59391acc22c0cde64924680410d7e1c4: Status 404 returned error can't find the container with id 937e09db46e0a6b9d6ad5b856bbf91ad59391acc22c0cde64924680410d7e1c4 Apr 23 08:42:10.376632 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:10.376617 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:42:11.032143 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:42:11.032106 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" event={"ID":"98b0c410-29f4-42b1-be00-8d89e3efad9b","Type":"ContainerStarted","Data":"937e09db46e0a6b9d6ad5b856bbf91ad59391acc22c0cde64924680410d7e1c4"} Apr 23 08:48:37.920070 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:48:37.920039 2575 eviction_manager.go:376] "Eviction manager: attempting to reclaim" resourceName="ephemeral-storage" Apr 23 08:48:37.999952 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:48:37.920097 2575 container_gc.go:86] "Attempting to delete unused containers" Apr 23 08:48:37.999952 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:48:37.921533 2575 scope.go:117] "RemoveContainer" 
containerID="53654a4173d960af6bdd7a4db3d5e225043115b8d519859325eadb19b4ac824a" Apr 23 08:48:43.677421 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:48:43.677330 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasDiskPressure" Apr 23 08:50:01.663897 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:50:01.663851 2575 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 23 08:50:01.663897 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:50:01.663904 2575 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 08:50:01.664433 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:50:01.663914 2575 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 08:50:37.923090 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:50:37.923051 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" containerID="53654a4173d960af6bdd7a4db3d5e225043115b8d519859325eadb19b4ac824a" Apr 23 08:50:37.923643 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:50:37.923099 2575 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" containerID="53654a4173d960af6bdd7a4db3d5e225043115b8d519859325eadb19b4ac824a" Apr 23 08:50:37.923643 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:50:37.923122 2575 scope.go:117] "RemoveContainer" containerID="9c583e7dbbba06df8f4b52d965ab74743e5e4b1aec6a960639e16763453df39c" Apr 23 08:51:32.230830 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:51:32.230791 2575 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context 
deadline exceeded" filter="nil" Apr 23 08:51:32.230830 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:51:32.230829 2575 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 08:51:32.231349 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:51:32.230843 2575 image_gc_manager.go:222] "Failed to monitor images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 08:51:32.234079 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:51:32.234044 2575 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 23 08:51:32.234079 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:51:32.234081 2575 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 08:51:32.234242 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:51:32.234092 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 08:52:31.664460 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:52:31.664417 2575 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil" Apr 23 08:52:31.664460 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:52:31.664467 2575 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 23 08:52:31.665050 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:31.664481 2575 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 23 08:52:37.924118 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:52:37.924059 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="9c583e7dbbba06df8f4b52d965ab74743e5e4b1aec6a960639e16763453df39c" Apr 23 08:52:37.924118 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:52:37.924128 2575 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="9c583e7dbbba06df8f4b52d965ab74743e5e4b1aec6a960639e16763453df39c" Apr 23 08:52:37.924697 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:37.924151 2575 scope.go:117] "RemoveContainer" containerID="096c7aa34be26732e388e04b3412de984f125158015c3467b93f90ac9339168d" Apr 23 08:52:49.222570 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:49.222539 2575 scope.go:117] "RemoveContainer" containerID="a727bdf8af75129436a726230c7b7cb04bf2934a9e8dda55a2d17733081c0db3" Apr 23 08:52:49.261356 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:49.261336 2575 image_gc_manager.go:447] "Attempting to delete unused 
images" Apr 23 08:52:49.279164 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:49.279139 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" size=1065600018 runtimeHandler="" Apr 23 08:52:49.279754 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:52:49.279729 2575 log.go:32] "RemoveImage from image service failed" err="rpc error: code = Unknown desc = delete image: image used by 0225ae1776803da7406836f9e15b695979858f5d09bc3defd724b5ec9089db09: image is in use by a container" image="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" Apr 23 08:52:49.279842 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:52:49.279771 2575 kuberuntime_image.go:137] "Failed to remove image" err="rpc error: code = Unknown desc = delete image: image used by 0225ae1776803da7406836f9e15b695979858f5d09bc3defd724b5ec9089db09: image is in use by a container" image="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" Apr 23 08:52:49.279842 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:49.279783 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="ac4be6c7a52584c773ae754a4ccfb9fb1db440f4c9d858ad0f78765a85625b4b" size=1065006420 runtimeHandler="" Apr 23 08:52:49.788260 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:49.788225 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="8cfae5f12a3d5e8f5711d1531d223358c13a3d4b36be844d8c6890efdfa09339" size=622989096 runtimeHandler="" Apr 23 08:52:49.850727 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:49.850689 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="bd2f0c6a473dfa650b536cfe1992446bf45305b3ace698398143f161694113a5" size=20806872103 runtimeHandler="" Apr 23 08:52:49.955919 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:52:49.955876 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying 
system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/llvm/bin/clang-19: no space left on device); artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6" Apr 23 08:52:49.956115 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:52:49.956076 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:node,Image:quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6,Command:[python -c import torch; print(f'PyTorch version: {torch.__version__}'); print('Training completed successfully')],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:29500,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:PET_NNODES,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NPROC_PER_NODE,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NODE_RANK,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PET_MASTER_ADDR,Value:test-trainjob-xl5kw-node-0-0.test-trainjob-xl5kw,ValueFrom:nil,},EnvVar{Name:PET_MASTER_PORT,Value:29500,ValueFrom:nil,},EnvVar{Name:JOB_COMPLETION_INDEX,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-khz7w
,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-trainjob-xl5kw-node-0-0-grfdw_test-ns-w4p9q(98b0c410-29f4-42b1-be00-8d89e3efad9b): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/llvm/bin/clang-19: no space left on device); artifact err: provided artifact is a container image" logger="UnhandledError" Apr 23 08:52:49.957257 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:52:49.957230 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/llvm/bin/clang-19: no space left on 
device); artifact err: provided artifact is a container image\"" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" podUID="98b0c410-29f4-42b1-be00-8d89e3efad9b" Apr 23 08:52:50.143557 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:50.143531 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-w4p9q\"/\"default-dockercfg-s7sp6\"" Apr 23 08:52:50.277046 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:50.277018 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-w4p9q\"/\"kube-root-ca.crt\"" Apr 23 08:52:50.287497 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:50.287478 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-w4p9q\"/\"openshift-service-ca.crt\"" Apr 23 08:52:53.728157 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:53.727708 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="7e65b8288e37c3f4fac04e8bf51240765caae34795b317d44d5399762a08b761" size=23201654702 runtimeHandler="" Apr 23 08:52:53.728157 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:53.728110 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:52:53.728493 ip-10-0-129-53 kubenswrapper[2575]: E0423 08:52:53.728333 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/llvm/bin/clang-19: no 
space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" podUID="98b0c410-29f4-42b1-be00-8d89e3efad9b" Apr 23 08:52:57.652371 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:52:57.652332 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="038c73cf9d35e89709b4f826c0ceb8dc783a3aa366d3139240c1a1da0ec1e546" size=7588072914 runtimeHandler="" Apr 23 08:53:00.926592 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:53:00.926556 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="819e15fdec92d846e6d5de4b1b2988adcb74f6d3046689fe03c655b03a67975d" size=18873458221 runtimeHandler="" Apr 23 08:53:03.838664 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:53:03.838623 2575 eviction_manager.go:473] "Eviction manager: unexpected error when attempting to reduce resource pressure" resourceName="ephemeral-storage" err="wanted to free 9223372036854775807 bytes, but freed 72158053456 bytes space with errors in image deletion: rpc error: code = Unknown desc = delete image: image used by 0225ae1776803da7406836f9e15b695979858f5d09bc3defd724b5ec9089db09: image is in use by a container" Apr 23 08:53:03.845865 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:53:03.845843 2575 eviction_manager.go:383] "Eviction manager: able to reduce resource pressure without evicting pods." 
resourceName="ephemeral-storage" Apr 23 08:53:49.649367 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:53:49.649338 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:56:32.258957 ip-10-0-129-53 kubenswrapper[2575]: I0423 08:56:32.258861 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 09:00:33.871541 ip-10-0-129-53 kubenswrapper[2575]: E0423 09:00:33.871500 2575 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 23 09:00:33.871541 ip-10-0-129-53 kubenswrapper[2575]: E0423 09:00:33.871546 2575 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 09:00:33.925145 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:00:33.871556 2575 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 09:02:09.893331 ip-10-0-129-53 kubenswrapper[2575]: E0423 09:02:09.893228 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53654a4173d960af6bdd7a4db3d5e225043115b8d519859325eadb19b4ac824a\": container with ID starting with 53654a4173d960af6bdd7a4db3d5e225043115b8d519859325eadb19b4ac824a not found: ID does not exist" containerID="53654a4173d960af6bdd7a4db3d5e225043115b8d519859325eadb19b4ac824a" Apr 23 09:02:09.993807 ip-10-0-129-53 kubenswrapper[2575]: E0423 09:02:09.993775 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c583e7dbbba06df8f4b52d965ab74743e5e4b1aec6a960639e16763453df39c\": container with ID starting with 9c583e7dbbba06df8f4b52d965ab74743e5e4b1aec6a960639e16763453df39c not found: ID does not exist" 
containerID="9c583e7dbbba06df8f4b52d965ab74743e5e4b1aec6a960639e16763453df39c" Apr 23 09:02:10.093883 ip-10-0-129-53 kubenswrapper[2575]: E0423 09:02:10.093852 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"096c7aa34be26732e388e04b3412de984f125158015c3467b93f90ac9339168d\": container with ID starting with 096c7aa34be26732e388e04b3412de984f125158015c3467b93f90ac9339168d not found: ID does not exist" containerID="096c7aa34be26732e388e04b3412de984f125158015c3467b93f90ac9339168d" Apr 23 09:02:10.588803 ip-10-0-129-53 kubenswrapper[2575]: E0423 09:02:10.588759 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a727bdf8af75129436a726230c7b7cb04bf2934a9e8dda55a2d17733081c0db3\": container with ID starting with a727bdf8af75129436a726230c7b7cb04bf2934a9e8dda55a2d17733081c0db3 not found: ID does not exist" containerID="a727bdf8af75129436a726230c7b7cb04bf2934a9e8dda55a2d17733081c0db3" Apr 23 09:02:14.842426 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:14.842391 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw"] Apr 23 09:02:14.943931 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:14.943890 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"] Apr 23 09:02:14.950025 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:14.949988 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-ccmmn/test-trainjob-f822p-node-0-0-b4vcs"] Apr 23 09:02:15.044235 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:15.044191 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"] Apr 23 09:02:15.045953 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:15.045925 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["test-ns-j9crz/test-trainjob-zw29c-node-0-0-f5dtj"] Apr 23 09:02:15.220593 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:15.220556 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"] Apr 23 09:02:15.226231 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:15.226207 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-nfslg/test-trainjob-zxl4c-node-0-0-8z2gw"] Apr 23 09:02:15.869734 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:15.869699 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"] Apr 23 09:02:15.875578 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:15.875554 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-kzjxw/test-trainjob-cvqh5-node-0-0-f7qpz"] Apr 23 09:02:16.776747 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:16.776710 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48eed880-787e-4f3d-b86d-6f0898e86459" path="/var/lib/kubelet/pods/48eed880-787e-4f3d-b86d-6f0898e86459/volumes" Apr 23 09:02:16.777178 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:16.777162 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="572438f0-32c1-4ad1-8d04-867320107fdf" path="/var/lib/kubelet/pods/572438f0-32c1-4ad1-8d04-867320107fdf/volumes" Apr 23 09:02:16.777591 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:16.777572 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b2eb03-50d8-44a9-a30e-a5d4f08d5d96" path="/var/lib/kubelet/pods/67b2eb03-50d8-44a9-a30e-a5d4f08d5d96/volumes" Apr 23 09:02:16.777975 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:16.777957 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9413768a-0739-4b86-b01e-9e22013bf338" path="/var/lib/kubelet/pods/9413768a-0739-4b86-b01e-9e22013bf338/volumes" Apr 23 09:02:27.177974 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:27.177938 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-rh4m6_a226d354-93fe-46d5-9753-49d3657f216c/manager/0.log" Apr 23 09:02:27.646590 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:27.646554 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-rh4m6_a226d354-93fe-46d5-9753-49d3657f216c/manager/0.log" Apr 23 09:02:28.109790 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:28.109761 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-rh4m6_a226d354-93fe-46d5-9753-49d3657f216c/manager/0.log" Apr 23 09:02:49.903325 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:49.903270 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" event={"ID":"98b0c410-29f4-42b1-be00-8d89e3efad9b","Type":"ContainerStarted","Data":"0de3e54ce4405fbe7c54eddec505d50cc271646f0a8680895628dd6ea60539c1"} Apr 23 09:02:49.903325 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:49.903326 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" podUID="98b0c410-29f4-42b1-be00-8d89e3efad9b" containerName="node" containerID="cri-o://0de3e54ce4405fbe7c54eddec505d50cc271646f0a8680895628dd6ea60539c1" gracePeriod=30 Apr 23 09:02:49.921073 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:02:49.921019 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" podStartSLOduration=2.1699124850000002 podStartE2EDuration="20m40.921004088s" podCreationTimestamp="2026-04-23 08:42:09 +0000 UTC" firstStartedPulling="2026-04-23 08:42:10.376746459 +0000 UTC m=+1668.175136905" lastFinishedPulling="2026-04-23 09:02:49.127838063 +0000 UTC m=+2906.926228508" observedRunningTime="2026-04-23 09:02:49.920133069 +0000 UTC m=+2907.718523544" 
watchObservedRunningTime="2026-04-23 09:02:49.921004088 +0000 UTC m=+2907.719394555" Apr 23 09:03:05.886810 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:05.886772 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmjn8/must-gather-lh65m"] Apr 23 09:03:06.064666 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.064625 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjn8/must-gather-lh65m"] Apr 23 09:03:06.064823 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.064741 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmjn8/must-gather-lh65m" Apr 23 09:03:06.067565 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.067542 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjn8\"/\"kube-root-ca.crt\"" Apr 23 09:03:06.067705 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.067581 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjn8\"/\"openshift-service-ca.crt\"" Apr 23 09:03:06.068862 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.068839 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tmjn8\"/\"default-dockercfg-pjq8w\"" Apr 23 09:03:06.134441 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.134401 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgt66\" (UniqueName: \"kubernetes.io/projected/19b1f3ab-75cb-419e-90e3-668f96f5a2e7-kube-api-access-wgt66\") pod \"must-gather-lh65m\" (UID: \"19b1f3ab-75cb-419e-90e3-668f96f5a2e7\") " pod="openshift-must-gather-tmjn8/must-gather-lh65m" Apr 23 09:03:06.134441 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.134438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/19b1f3ab-75cb-419e-90e3-668f96f5a2e7-must-gather-output\") pod \"must-gather-lh65m\" (UID: \"19b1f3ab-75cb-419e-90e3-668f96f5a2e7\") " pod="openshift-must-gather-tmjn8/must-gather-lh65m" Apr 23 09:03:06.235623 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.235521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgt66\" (UniqueName: \"kubernetes.io/projected/19b1f3ab-75cb-419e-90e3-668f96f5a2e7-kube-api-access-wgt66\") pod \"must-gather-lh65m\" (UID: \"19b1f3ab-75cb-419e-90e3-668f96f5a2e7\") " pod="openshift-must-gather-tmjn8/must-gather-lh65m" Apr 23 09:03:06.235623 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.235565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/19b1f3ab-75cb-419e-90e3-668f96f5a2e7-must-gather-output\") pod \"must-gather-lh65m\" (UID: \"19b1f3ab-75cb-419e-90e3-668f96f5a2e7\") " pod="openshift-must-gather-tmjn8/must-gather-lh65m" Apr 23 09:03:06.235937 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.235919 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/19b1f3ab-75cb-419e-90e3-668f96f5a2e7-must-gather-output\") pod \"must-gather-lh65m\" (UID: \"19b1f3ab-75cb-419e-90e3-668f96f5a2e7\") " pod="openshift-must-gather-tmjn8/must-gather-lh65m" Apr 23 09:03:06.245269 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.245238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgt66\" (UniqueName: \"kubernetes.io/projected/19b1f3ab-75cb-419e-90e3-668f96f5a2e7-kube-api-access-wgt66\") pod \"must-gather-lh65m\" (UID: \"19b1f3ab-75cb-419e-90e3-668f96f5a2e7\") " pod="openshift-must-gather-tmjn8/must-gather-lh65m" Apr 23 09:03:06.374353 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.374289 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tmjn8/must-gather-lh65m" Apr 23 09:03:06.494955 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.494859 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjn8/must-gather-lh65m"] Apr 23 09:03:06.499754 ip-10-0-129-53 kubenswrapper[2575]: W0423 09:03:06.499716 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19b1f3ab_75cb_419e_90e3_668f96f5a2e7.slice/crio-c402fd5db0dad4e27758d16312aedfa204951dd50c8bd4fb05f1c612c04ae059 WatchSource:0}: Error finding container c402fd5db0dad4e27758d16312aedfa204951dd50c8bd4fb05f1c612c04ae059: Status 404 returned error can't find the container with id c402fd5db0dad4e27758d16312aedfa204951dd50c8bd4fb05f1c612c04ae059 Apr 23 09:03:06.501718 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.501701 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:03:06.953584 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:06.953545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjn8/must-gather-lh65m" event={"ID":"19b1f3ab-75cb-419e-90e3-668f96f5a2e7","Type":"ContainerStarted","Data":"c402fd5db0dad4e27758d16312aedfa204951dd50c8bd4fb05f1c612c04ae059"} Apr 23 09:03:07.958232 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:07.958206 2575 generic.go:358] "Generic (PLEG): container finished" podID="98b0c410-29f4-42b1-be00-8d89e3efad9b" containerID="0de3e54ce4405fbe7c54eddec505d50cc271646f0a8680895628dd6ea60539c1" exitCode=0 Apr 23 09:03:07.958603 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:07.958273 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" event={"ID":"98b0c410-29f4-42b1-be00-8d89e3efad9b","Type":"ContainerDied","Data":"0de3e54ce4405fbe7c54eddec505d50cc271646f0a8680895628dd6ea60539c1"} Apr 23 09:03:07.958603 ip-10-0-129-53 
kubenswrapper[2575]: I0423 09:03:07.958328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" event={"ID":"98b0c410-29f4-42b1-be00-8d89e3efad9b","Type":"ContainerDied","Data":"937e09db46e0a6b9d6ad5b856bbf91ad59391acc22c0cde64924680410d7e1c4"} Apr 23 09:03:07.958603 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:07.958339 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="937e09db46e0a6b9d6ad5b856bbf91ad59391acc22c0cde64924680410d7e1c4" Apr 23 09:03:07.959426 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:07.959407 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" Apr 23 09:03:07.959659 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:07.959631 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjn8/must-gather-lh65m" event={"ID":"19b1f3ab-75cb-419e-90e3-668f96f5a2e7","Type":"ContainerStarted","Data":"5a99bab99f368a79dc582b53adf7aecbe1a714f9aa66d058a4625422d03c5177"} Apr 23 09:03:08.051620 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:08.051587 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khz7w\" (UniqueName: \"kubernetes.io/projected/98b0c410-29f4-42b1-be00-8d89e3efad9b-kube-api-access-khz7w\") pod \"98b0c410-29f4-42b1-be00-8d89e3efad9b\" (UID: \"98b0c410-29f4-42b1-be00-8d89e3efad9b\") " Apr 23 09:03:08.054393 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:08.054348 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98b0c410-29f4-42b1-be00-8d89e3efad9b-kube-api-access-khz7w" (OuterVolumeSpecName: "kube-api-access-khz7w") pod "98b0c410-29f4-42b1-be00-8d89e3efad9b" (UID: "98b0c410-29f4-42b1-be00-8d89e3efad9b"). InnerVolumeSpecName "kube-api-access-khz7w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:03:08.152996 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:08.152968 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-khz7w\" (UniqueName: \"kubernetes.io/projected/98b0c410-29f4-42b1-be00-8d89e3efad9b-kube-api-access-khz7w\") on node \"ip-10-0-129-53.ec2.internal\" DevicePath \"\"" Apr 23 09:03:08.963711 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:08.963672 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjn8/must-gather-lh65m" event={"ID":"19b1f3ab-75cb-419e-90e3-668f96f5a2e7","Type":"ContainerStarted","Data":"5b8b0447149b407518005b77bbf5cd7944336440b9294150d7bf2be2f9a894dd"} Apr 23 09:03:08.963711 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:08.963721 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw" Apr 23 09:03:08.980124 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:08.980077 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmjn8/must-gather-lh65m" podStartSLOduration=2.707893431 podStartE2EDuration="3.980061988s" podCreationTimestamp="2026-04-23 09:03:05 +0000 UTC" firstStartedPulling="2026-04-23 09:03:06.501834596 +0000 UTC m=+2924.300225045" lastFinishedPulling="2026-04-23 09:03:07.774003156 +0000 UTC m=+2925.572393602" observedRunningTime="2026-04-23 09:03:08.979466699 +0000 UTC m=+2926.777857167" watchObservedRunningTime="2026-04-23 09:03:08.980061988 +0000 UTC m=+2926.778452454" Apr 23 09:03:08.992310 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:08.992234 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw"] Apr 23 09:03:08.996432 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:08.996409 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-w4p9q/test-trainjob-xl5kw-node-0-0-grfdw"] Apr 23 09:03:10.456637 
ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:10.456601 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9ffdf_a6e9434c-f939-423a-8859-5a047d434c0c/global-pull-secret-syncer/0.log" Apr 23 09:03:10.662447 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:10.662416 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vj7h5_e3ee3b59-dda6-4efc-944c-7d052a7a6c46/konnectivity-agent/0.log" Apr 23 09:03:10.774696 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:10.774616 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-53.ec2.internal_9b0e838abf38c7589ab4ddfd6dec535f/haproxy/0.log" Apr 23 09:03:10.778387 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:10.778360 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98b0c410-29f4-42b1-be00-8d89e3efad9b" path="/var/lib/kubelet/pods/98b0c410-29f4-42b1-be00-8d89e3efad9b/volumes" Apr 23 09:03:14.562981 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:14.562909 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ncncp_fbb8a1c5-375f-477c-a34a-d075aa60da89/kube-state-metrics/0.log" Apr 23 09:03:14.596075 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:14.595995 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ncncp_fbb8a1c5-375f-477c-a34a-d075aa60da89/kube-rbac-proxy-main/0.log" Apr 23 09:03:14.635382 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:14.635277 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ncncp_fbb8a1c5-375f-477c-a34a-d075aa60da89/kube-rbac-proxy-self/0.log" Apr 23 09:03:14.669844 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:14.669815 2575 log.go:25] "Incomplete line in log file" 
path="/var/log/pods/openshift-monitoring_metrics-server-989d867f7-8tp78_0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74/metrics-server/0.log" line="2026-04-23T08:52:33.531346525+00:00 stderr F 2026-04-23T" Apr 23 09:03:14.670017 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:14.669862 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-989d867f7-8tp78_0b7ac758-0bd8-4ac9-9785-cfd4aa3a3e74/metrics-server/0.log" Apr 23 09:03:14.700624 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:14.700598 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-4pntm_3345d53f-22be-4035-8bde-e7cca402f09e/monitoring-plugin/0.log" Apr 23 09:03:14.924131 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:14.924101 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wdpfr_61cdcd9f-f094-40e3-9ea7-a17d4855004a/node-exporter/0.log" Apr 23 09:03:14.948689 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:14.948644 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wdpfr_61cdcd9f-f094-40e3-9ea7-a17d4855004a/kube-rbac-proxy/0.log" Apr 23 09:03:14.980686 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:14.980656 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wdpfr_61cdcd9f-f094-40e3-9ea7-a17d4855004a/init-textfile/0.log" Apr 23 09:03:15.012197 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.012133 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vgmkr_923b78bc-6002-41ca-97fa-17f5c638dc18/kube-rbac-proxy-main/0.log" Apr 23 09:03:15.047531 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.047500 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vgmkr_923b78bc-6002-41ca-97fa-17f5c638dc18/kube-rbac-proxy-self/0.log" Apr 23 
09:03:15.091791 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.091764 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vgmkr_923b78bc-6002-41ca-97fa-17f5c638dc18/openshift-state-metrics/0.log" Apr 23 09:03:15.151868 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.151836 2575 log.go:25] "Incomplete line in log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_758f034c-8264-465e-b606-8e656b7d1424/prometheus/0.log" line="2026-04-23T08:52:40.214438840+00:00 stderr" Apr 23 09:03:15.151868 ip-10-0-129-53 kubenswrapper[2575]: E0423 09:03:15.151872 2575 log.go:32] "Failed when parsing line in log file" err="stream type is not found" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_758f034c-8264-465e-b606-8e656b7d1424/prometheus/0.log" line="2026-04-23T08:52:40.214438840+00:00 stderr" Apr 23 09:03:15.152118 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.151888 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_758f034c-8264-465e-b606-8e656b7d1424/prometheus/0.log" Apr 23 09:03:15.188425 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.188355 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_758f034c-8264-465e-b606-8e656b7d1424/config-reloader/0.log" Apr 23 09:03:15.243355 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.243260 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_758f034c-8264-465e-b606-8e656b7d1424/thanos-sidecar/0.log" Apr 23 09:03:15.277722 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.277692 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_758f034c-8264-465e-b606-8e656b7d1424/kube-rbac-proxy-web/0.log" Apr 23 09:03:15.306065 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.306042 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_758f034c-8264-465e-b606-8e656b7d1424/kube-rbac-proxy/0.log" Apr 23 09:03:15.401199 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.401170 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_758f034c-8264-465e-b606-8e656b7d1424/kube-rbac-proxy-thanos/0.log" Apr 23 09:03:15.453680 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.453588 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_758f034c-8264-465e-b606-8e656b7d1424/init-config-reloader/0.log" Apr 23 09:03:15.482508 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.482476 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-qn8zr_8379a0e9-e80c-4b7e-85e6-a0bfdf88e222/prometheus-operator/0.log" Apr 23 09:03:15.513591 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:15.513563 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-qn8zr_8379a0e9-e80c-4b7e-85e6-a0bfdf88e222/kube-rbac-proxy/0.log" Apr 23 09:03:17.612468 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.612397 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-n9mmn_ba7b4e72-c90c-416a-9437-1f377ecf8e36/download-server/0.log" Apr 23 09:03:17.791987 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.791948 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78"] Apr 23 09:03:17.792483 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.792455 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98b0c410-29f4-42b1-be00-8d89e3efad9b" containerName="node" Apr 23 09:03:17.792483 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.792482 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="98b0c410-29f4-42b1-be00-8d89e3efad9b" 
containerName="node" Apr 23 09:03:17.792667 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.792571 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="98b0c410-29f4-42b1-be00-8d89e3efad9b" containerName="node" Apr 23 09:03:17.854400 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.854360 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78"] Apr 23 09:03:17.854585 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.854521 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:17.945616 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.945580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-sys\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:17.945858 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.945836 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-podres\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:17.946050 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.946026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-proc\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:17.946179 ip-10-0-129-53 
kubenswrapper[2575]: I0423 09:03:17.946063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d2dn\" (UniqueName: \"kubernetes.io/projected/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-kube-api-access-8d2dn\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:17.946179 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:17.946090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-lib-modules\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.046904 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.046872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-proc\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.046904 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.046906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d2dn\" (UniqueName: \"kubernetes.io/projected/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-kube-api-access-8d2dn\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.047094 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.046926 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-lib-modules\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.047094 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.046990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-sys\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.047094 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.046999 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-proc\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.047094 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.047009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-podres\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.047223 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.047095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-sys\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.047223 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.047112 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-podres\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.047223 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.047112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-lib-modules\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.056314 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.056259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d2dn\" (UniqueName: \"kubernetes.io/projected/7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4-kube-api-access-8d2dn\") pod \"perf-node-gather-daemonset-6tw78\" (UID: \"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.167130 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.167039 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:18.506470 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.506398 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78"] Apr 23 09:03:18.509807 ip-10-0-129-53 kubenswrapper[2575]: W0423 09:03:18.509778 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7cfe4140_e16f_4d9b_91cf_16cfa3cc83a4.slice/crio-5f70339df74406f8bf94368245df9ac3680ff806f16c019a2b19eb7af4acef12 WatchSource:0}: Error finding container 5f70339df74406f8bf94368245df9ac3680ff806f16c019a2b19eb7af4acef12: Status 404 returned error can't find the container with id 5f70339df74406f8bf94368245df9ac3680ff806f16c019a2b19eb7af4acef12 Apr 23 09:03:18.862137 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.862111 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5r722_8a62b026-adde-4674-a052-cc9aa72e0a2a/dns/0.log" Apr 23 09:03:18.891231 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.891199 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5r722_8a62b026-adde-4674-a052-cc9aa72e0a2a/kube-rbac-proxy/0.log" Apr 23 09:03:18.957900 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:18.957869 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sx7f6_f798ddd1-abe6-46fe-8f87-51eb8f211cba/dns-node-resolver/0.log" Apr 23 09:03:19.014913 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:19.014879 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" event={"ID":"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4","Type":"ContainerStarted","Data":"cc0bc34349c12d7c2457189f2319c48a60f020dc7dbabb9d971162d44b70f54a"} Apr 23 09:03:19.014913 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:19.014919 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" event={"ID":"7cfe4140-e16f-4d9b-91cf-16cfa3cc83a4","Type":"ContainerStarted","Data":"5f70339df74406f8bf94368245df9ac3680ff806f16c019a2b19eb7af4acef12"} Apr 23 09:03:19.015110 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:19.014975 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:19.034006 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:19.033961 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" podStartSLOduration=2.03394731 podStartE2EDuration="2.03394731s" podCreationTimestamp="2026-04-23 09:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:03:19.033816403 +0000 UTC m=+2936.832206873" watchObservedRunningTime="2026-04-23 09:03:19.03394731 +0000 UTC m=+2936.832337776" Apr 23 09:03:19.435337 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:19.435288 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6c5f794bbc-jk7t7_51bf1211-8340-4a64-97b4-f7f8f0e8eb17/registry/0.log" Apr 23 09:03:19.519266 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:19.519215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ht997_ccf4efd3-2881-48aa-80c7-44d62f243db8/node-ca/0.log" Apr 23 09:03:20.271404 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:20.271370 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-578458b7fb-b9s92_fc24cfd5-bd05-46ce-9716-b6dcf55769b2/router/0.log" Apr 23 09:03:20.667156 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:20.667125 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-z4nr6_5c18ff90-80b0-4fd9-a24d-ee6d39d729b7/serve-healthcheck-canary/0.log" Apr 23 09:03:21.242338 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:21.242306 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v6phf_7a73e05a-b51d-4618-8f88-bc4fcbe4fd54/kube-rbac-proxy/0.log" Apr 23 09:03:21.266223 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:21.266201 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v6phf_7a73e05a-b51d-4618-8f88-bc4fcbe4fd54/exporter/0.log" Apr 23 09:03:21.293429 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:21.293394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v6phf_7a73e05a-b51d-4618-8f88-bc4fcbe4fd54/extractor/0.log" Apr 23 09:03:25.027230 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:25.027203 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-6tw78" Apr 23 09:03:27.850947 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:27.850921 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx29n_00bd34ed-f0ff-40f8-bd12-a9c364e0fe82/kube-multus-additional-cni-plugins/0.log" Apr 23 09:03:27.900002 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:27.899974 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx29n_00bd34ed-f0ff-40f8-bd12-a9c364e0fe82/egress-router-binary-copy/0.log" Apr 23 09:03:27.945171 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:27.945129 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx29n_00bd34ed-f0ff-40f8-bd12-a9c364e0fe82/cni-plugins/0.log" Apr 23 09:03:27.994399 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:27.994372 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx29n_00bd34ed-f0ff-40f8-bd12-a9c364e0fe82/bond-cni-plugin/0.log" Apr 23 09:03:28.051534 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:28.051465 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx29n_00bd34ed-f0ff-40f8-bd12-a9c364e0fe82/routeoverride-cni/0.log" Apr 23 09:03:28.100197 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:28.100154 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx29n_00bd34ed-f0ff-40f8-bd12-a9c364e0fe82/whereabouts-cni-bincopy/0.log" Apr 23 09:03:28.135634 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:28.135597 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx29n_00bd34ed-f0ff-40f8-bd12-a9c364e0fe82/whereabouts-cni/0.log" Apr 23 09:03:28.367933 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:28.367879 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dhsc2_b040571c-3a7e-407f-8b6a-b70862f5b8c0/kube-multus/0.log" Apr 23 09:03:28.509812 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:28.509784 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gtsb8_75af54d6-d4ae-4e8e-bf63-80cc7a54fe63/network-metrics-daemon/0.log" Apr 23 09:03:28.534353 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:28.534327 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gtsb8_75af54d6-d4ae-4e8e-bf63-80cc7a54fe63/kube-rbac-proxy/0.log" Apr 23 09:03:30.025475 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:30.025439 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xlcv7_7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31/ovn-controller/0.log" Apr 23 09:03:30.076981 ip-10-0-129-53 
kubenswrapper[2575]: I0423 09:03:30.076950 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xlcv7_7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31/ovn-acl-logging/0.log" Apr 23 09:03:30.104585 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:30.104557 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xlcv7_7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31/kube-rbac-proxy-node/0.log" Apr 23 09:03:30.142193 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:30.142168 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xlcv7_7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 09:03:30.198366 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:30.198230 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xlcv7_7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31/northd/0.log" Apr 23 09:03:30.224113 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:30.224086 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xlcv7_7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31/nbdb/0.log" Apr 23 09:03:30.249151 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:30.249129 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xlcv7_7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31/sbdb/0.log" Apr 23 09:03:30.484478 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:30.484379 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xlcv7_7a42bc96-af2a-4e37-9f0b-3ef91d9e9e31/ovnkube-controller/0.log" Apr 23 09:03:31.603016 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:31.602990 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-b4f6z_b566e32e-9149-4af2-a4dd-4dba40f00efa/network-check-target-container/0.log" Apr 23 09:03:32.649084 
ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:32.649055 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rtkxh_908435e9-5714-4471-be36-df61c28816d3/iptables-alerter/0.log" Apr 23 09:03:33.327095 ip-10-0-129-53 kubenswrapper[2575]: I0423 09:03:33.327060 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-vrk6z_b2d80946-cce7-4f66-bd36-9b3413670c78/tuned/0.log"