Apr 16 13:56:31.589691 ip-10-0-140-59 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:56:31.589704 ip-10-0-140-59 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:56:31.589713 ip-10-0-140-59 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:56:31.590011 ip-10-0-140-59 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:56:41.830366 ip-10-0-140-59 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:56:41.830381 ip-10-0-140-59 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 5ff79378c50f46519e52b45d65c38043 --
Apr 16 13:58:54.734769 ip-10-0-140-59 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:58:55.216829 ip-10-0-140-59 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:58:55.216829 ip-10-0-140-59 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:58:55.216829 ip-10-0-140-59 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:58:55.216829 ip-10-0-140-59 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:58:55.216829 ip-10-0-140-59 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:58:55.218017 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.217848 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:58:55.223817 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223799 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:58:55.223817 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223816 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223820 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223824 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223828 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223831 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223834 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223836 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223839 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223842 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223845 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223848 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223851 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223854 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223857 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223860 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223862 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223865 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223868 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223877 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223880 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:58:55.223896 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223883 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223885 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223888 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223890 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223893 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223895 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223899 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223902 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223905 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223907 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223910 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223914 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223916 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223919 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223921 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223924 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223927 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223929 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223932 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223934 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:58:55.224384 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223938 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223941 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223943 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223946 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223949 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223951 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223954 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223956 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223959 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223961 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223966 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223970 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223973 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223977 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223981 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223984 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223987 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223990 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223992 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:58:55.224918 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223995 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.223998 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224000 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224003 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224005 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224008 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224012 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224014 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224016 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224019 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224023 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224026 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224028 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224031 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224033 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224036 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224038 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224041 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224043 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224046 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:58:55.225411 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224048 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224051 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224054 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224057 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224059 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224063 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224484 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224491 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224495 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224498 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224501 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224504 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224507 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224509 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224512 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224515 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224517 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224520 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224523 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224526 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:58:55.225905 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224529 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224532 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224535 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224537 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224540 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224544 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224547 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224550 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224553 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224556 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224559 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224561 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224564 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224567 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224569 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224572 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224574 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224577 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224580 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224584 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:58:55.226416 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224586 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224589 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224591 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224594 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224597 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224599 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224602 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224604 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224607 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224610 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224612 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224615 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224618 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224620 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224622 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224625 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224628 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224630 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224633 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224635 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:58:55.227023 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224638 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224640 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224643 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224645 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224648 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224650 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224653 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224655 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224658 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224661 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224665 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224668 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224671 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224673 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224676 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224679 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224682 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224684 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224687 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:58:55.227557 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224690 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224692 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224696 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224698 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224700 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224703 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224706 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224710 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224713 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224716 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224718 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224721 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.224723 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224802 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224809 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224818 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224825 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224831 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224837 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224843 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224849 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:58:55.228016 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224852 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224855 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224860 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224863 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224866 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224869 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224872 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224875 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224878 2572 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224881 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224884 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224890 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224893 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224896 2572 flags.go:64] FLAG: --config-dir=""
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224899 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224902 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224907 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224910 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224914 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224917 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224920 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224923 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224926 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224930 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224933 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:58:55.228537 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224938 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224941 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224944 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224947 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224950 2572 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224953 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224958 2572 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224962 2572 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224965 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224968 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224971 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224975 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224978 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224981 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224984 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224987 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224990 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224993 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: 
I0416 13:58:55.224996 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.224998 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225001 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225004 2572 flags.go:64] FLAG: --feature-gates="" Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225008 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225010 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225014 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:58:55.229179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225018 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225021 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225024 2572 flags.go:64] FLAG: --help="false" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225027 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225030 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225033 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225036 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225039 2572 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225043 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225046 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225049 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225052 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225055 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225058 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225063 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225066 2572 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225069 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225072 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225075 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225078 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225081 2572 flags.go:64] FLAG: --lock-file="" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225083 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" 
Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225086 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225089 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:58:55.229817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225095 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225098 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225100 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225103 2572 flags.go:64] FLAG: --logging-format="text" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225106 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225110 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225112 2572 flags.go:64] FLAG: --manifest-url="" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225115 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225120 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225124 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225129 2572 flags.go:64] FLAG: --max-pods="110" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225132 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225135 2572 flags.go:64] 
FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225137 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225141 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225144 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225146 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225150 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225157 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225160 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225163 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225167 2572 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225174 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:58:55.230395 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225180 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225183 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225186 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 
13:58:55.225189 2572 flags.go:64] FLAG: --port="10250" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225192 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225196 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-032350dc572a4d531" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225199 2572 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225202 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225205 2572 flags.go:64] FLAG: --register-node="true" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225208 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225211 2572 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225215 2572 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225218 2572 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225220 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225223 2572 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225227 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225230 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225233 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225236 2572 
flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225240 2572 flags.go:64] FLAG: --runonce="false" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225242 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225245 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225248 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225251 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225254 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225257 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:58:55.230975 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225260 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225264 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225267 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225270 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225273 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225285 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225289 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 
13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225292 2572 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225295 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225300 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225303 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225306 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225311 2572 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225314 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225317 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225320 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225323 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225326 2572 flags.go:64] FLAG: --v="2" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225330 2572 flags.go:64] FLAG: --version="false" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225334 2572 flags.go:64] FLAG: --vmodule="" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225338 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.225342 2572 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225438 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225442 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:58:55.231634 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225445 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225447 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225464 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225467 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225470 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225473 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225477 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225479 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225482 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225488 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:58:55.232222 ip-10-0-140-59 
kubenswrapper[2572]: W0416 13:58:55.225491 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225493 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225496 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225500 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225503 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225506 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225508 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225511 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225514 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225516 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:58:55.232222 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225519 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225522 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225525 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 
13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225527 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225530 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225532 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225535 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225538 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225540 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225543 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225545 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225548 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225550 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225553 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225555 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225559 2572 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstall Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225562 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225564 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225567 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225570 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:58:55.232769 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225572 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225576 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225578 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225581 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225584 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225587 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225590 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225592 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225595 2572 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225597 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225600 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225603 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225605 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225608 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225610 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225613 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225616 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225618 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225621 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:58:55.233260 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225623 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225626 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:58:55.233743 
ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225628 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225631 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225633 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225636 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225638 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225641 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225644 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225647 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225649 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225652 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225655 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225657 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225663 2572 feature_gate.go:349] Setting 
deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225666 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225669 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225672 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225676 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:58:55.233743 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225678 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:58:55.234221 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225682 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
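The long runs of `feature_gate.go:328] unrecognized feature gate: ...` warnings above repeat many gate names across restarts, so they are easier to audit de-duplicated. A minimal sketch of doing that (the function name and the idea of feeding in captured `journalctl` text as a string are assumptions for illustration, not part of the log):

```python
import re

# Matches warning entries of the form seen in this journal:
#   ... feature_gate.go:328] unrecognized feature gate: NewOLM
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def unrecognized_gates(log_text):
    """Return the sorted, de-duplicated gate names warned about in the log text."""
    return sorted(set(GATE_RE.findall(log_text)))
```

Running this over the boot record collapses the repeated warnings into one name per gate, which makes it easy to spot cluster-level gates (OpenShift operator gates such as `NewOLM` or `ManagedBootImagesAWS`) that the kubelet itself does not recognize.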
Apr 16 13:58:55.234221 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225686 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:58:55.234221 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225689 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:58:55.234221 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225692 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:58:55.234221 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.225695 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:58:55.234221 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.226471 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:58:55.234380 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.234323 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:58:55.234380 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.234339 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234388 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234393 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234396 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234399 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234402 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234405 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234408 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234410 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234413 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234416 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234419 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234422 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234424 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234429 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234432 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234435 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234438 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:58:55.234432 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234441 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234444 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234446 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234449 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234468 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234471 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234473 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234476 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234479 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234481 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234484 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234486 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234489 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234492 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234495 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234497 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234500 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234503 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234505 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:58:55.234891 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234509 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234513 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234515 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234518 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234520 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234524 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234526 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234529 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234532 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234535 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234538 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234540 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234543 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234546 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234549 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234552 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234554 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234557 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234560 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:58:55.235365 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234562 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234565 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234567 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234570 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234573 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234575 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234578 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234582 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234586 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234589 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234592 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234595 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234598 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234601 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234603 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234606 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234609 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234612 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234614 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234617 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:58:55.235843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234619 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234622 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234625 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234627 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234631 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234634 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234636 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234639 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234643 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234645 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234648 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.234653 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234765 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234769 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234773 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:58:55.236373 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234776 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234778 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234781 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234784 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234787 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234790 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234793 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234796 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234799 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234801 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234804 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234807 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234809 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234812 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234814 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234817 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234819 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234822 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234825 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234828 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:58:55.236840 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234830 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234833 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234836 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234838 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234841 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234844 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234847 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234849 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234852 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234856 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234860 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234863 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234865 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234868 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234871 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234874 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234876 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234879 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234881 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234884 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:58:55.237328 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234886 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234889 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234891 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234894 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234897 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234900 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234903 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234906 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234908 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234910 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234914 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234916 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234919 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234921 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234924 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234927 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234930 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234933 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234935 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234938 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:58:55.237823 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234940 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234943 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234947 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234951 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234954 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234957 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234960 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234963 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234965 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234969 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234971 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234974 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234977 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234979 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234982 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234985 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234987 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234990 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234993 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:58:55.238302 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234995 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:58:55.238773 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.234998 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:58:55.238773 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.235000 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:58:55.238773 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:55.235003 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:58:55.238773 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.235008 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:58:55.238773 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.235842 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:58:55.239278 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.239264 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:58:55.240385 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.240373 2572 server.go:1019] "Starting client certificate rotation"
Apr 16 13:58:55.240509 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.240489 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:58:55.241141 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.241130 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:58:55.268488 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.268469 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:58:55.272035 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.272020 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:58:55.287871 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.287841 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:58:55.294477 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.294443 2572 log.go:25] "Validated CRI v1 image API"
Apr 16 13:58:55.294597 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.294579 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:58:55.296001 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.295981 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:58:55.301549 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.301518 2572 fs.go:135] Filesystem UUIDs: map[3eb6bf84-a954-4219-a317-f527608cf887:/dev/nvme0n1p4 788077ff-d4d7-426a-a9c7-f94023177df7:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 13:58:55.301659 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.301549 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:58:55.307912 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.307783 2572 manager.go:217] Machine: {Timestamp:2026-04-16 13:58:55.305599812 +0000 UTC m=+0.461934843 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101308 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fcfc64bf32f2d12abf6cdb6311da4 SystemUUID:ec2fcfc6-4bf3-2f2d-12ab-f6cdb6311da4 BootID:5ff79378-c50f-4651-9e52-b45d65c38043 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:03:54:f0:d0:91 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:03:54:f0:d0:91 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:76:a5:aa:51:e6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:58:55.307912 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.307898 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:58:55.308066 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.308016 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:58:55.309341 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.309309 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:58:55.309539 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.309342 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-59.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 13:58:55.309618 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.309553 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 13:58:55.309618 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.309565 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 13:58:55.309618 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.309583 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:58:55.309618 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.309610 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:58:55.310558 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.310538 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-772f8" Apr 16 13:58:55.311665 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.311650 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:58:55.311811 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.311800 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:58:55.314661 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.314649 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:58:55.314728 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.314675 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 13:58:55.314728 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.314693 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:58:55.314728 ip-10-0-140-59 
kubenswrapper[2572]: I0416 13:58:55.314708 2572 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:58:55.314846 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.314736 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:58:55.315936 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.315923 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:58:55.315999 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.315958 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:58:55.317681 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.317662 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-772f8" Apr 16 13:58:55.319441 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.319420 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:58:55.323932 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.323895 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:58:55.325697 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325680 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:58:55.325697 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325701 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:58:55.325851 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325711 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:58:55.325851 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325718 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:58:55.325851 ip-10-0-140-59 
kubenswrapper[2572]: I0416 13:58:55.325724 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:58:55.325851 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325731 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:58:55.325851 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325737 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:58:55.325851 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325742 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:58:55.325851 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325750 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:58:55.325851 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325757 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:58:55.325851 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325765 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:58:55.325851 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.325775 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:58:55.326922 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.326905 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:58:55.326922 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.326919 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:58:55.329679 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.329658 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:55.331791 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.331778 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:58:55.331869 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.331817 2572 server.go:1295] 
"Started kubelet" Apr 16 13:58:55.331940 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.331815 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:55.331987 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.331930 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:58:55.332032 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.331912 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:58:55.332032 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.332026 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:58:55.332553 ip-10-0-140-59 systemd[1]: Started Kubernetes Kubelet. Apr 16 13:58:55.333505 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.333490 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:58:55.334893 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.334877 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:58:55.335598 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.335581 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-59.ec2.internal" not found Apr 16 13:58:55.341307 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.341290 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:58:55.341402 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.341309 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:58:55.341402 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:55.341335 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 13:58:55.342063 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342046 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:58:55.342063 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342065 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:58:55.342157 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342098 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:58:55.342157 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342114 2572 factory.go:55] Registering systemd factory Apr 16 13:58:55.342157 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342123 2572 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:58:55.342157 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342046 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:58:55.342306 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342185 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:58:55.342306 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342193 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:58:55.342306 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:55.342200 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-59.ec2.internal\" not found" Apr 16 13:58:55.342416 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342361 2572 factory.go:153] Registering CRI-O factory Apr 16 13:58:55.342416 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342374 2572 factory.go:223] Registration of the crio container factory successfully Apr 16 13:58:55.342416 
ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342399 2572 factory.go:103] Registering Raw factory Apr 16 13:58:55.342416 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342412 2572 manager.go:1196] Started watching for new ooms in manager Apr 16 13:58:55.342987 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.342975 2572 manager.go:319] Starting recovery of all containers Apr 16 13:58:55.343621 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.343595 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:55.346640 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:55.346614 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-59.ec2.internal\" not found" node="ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.351473 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.351434 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-59.ec2.internal" not found Apr 16 13:58:55.356095 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.355934 2572 manager.go:324] Recovery completed Apr 16 13:58:55.360475 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.360396 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:58:55.362557 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.362541 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-59.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:58:55.362630 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.362572 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:58:55.362630 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.362586 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-59.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:58:55.363074 
ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.363061 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:58:55.363123 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.363075 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:58:55.363123 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.363092 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:58:55.366930 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.366916 2572 policy_none.go:49] "None policy: Start" Apr 16 13:58:55.366992 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.366933 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:58:55.366992 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.366944 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.407942 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-59.ec2.internal" not found Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.408091 2572 manager.go:341] "Starting Device Plugin manager" Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:55.408215 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.408230 2572 server.go:85] "Starting device plugin registration server" Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.408479 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.408497 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.408598 2572 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.408709 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.408718 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:55.410680 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 13:58:55.413868 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:55.410725 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-59.ec2.internal\" not found" Apr 16 13:58:55.461540 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.461499 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 13:58:55.462743 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.462729 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 13:58:55.462830 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.462758 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:58:55.462830 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.462792 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 13:58:55.462830 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.462802 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:58:55.462958 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:55.462843 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:58:55.465550 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.465529 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:55.509629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.509564 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:58:55.510786 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.510770 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-59.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:58:55.510847 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.510799 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:58:55.510847 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.510810 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-59.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:58:55.510847 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.510837 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.520069 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.520047 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.520163 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:55.520074 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-59.ec2.internal\": node \"ip-10-0-140-59.ec2.internal\" not found" Apr 16 13:58:55.563215 
ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.563182 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal"] Apr 16 13:58:55.565897 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.565877 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.568684 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.568665 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.587103 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.587085 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.590631 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.590616 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.598857 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.598843 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:58:55.601239 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.601222 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:58:55.643531 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.643500 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e83a86637d931638418a028ca5b217de-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal\" (UID: \"e83a86637d931638418a028ca5b217de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.643531 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.643532 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e83a86637d931638418a028ca5b217de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal\" (UID: \"e83a86637d931638418a028ca5b217de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.643703 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.643549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1aecfc38a711305ef53a73c57dbb3d6-config\") pod \"kube-apiserver-proxy-ip-10-0-140-59.ec2.internal\" (UID: \"b1aecfc38a711305ef53a73c57dbb3d6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.744223 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.744196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1aecfc38a711305ef53a73c57dbb3d6-config\") pod \"kube-apiserver-proxy-ip-10-0-140-59.ec2.internal\" (UID: \"b1aecfc38a711305ef53a73c57dbb3d6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.744223 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.744226 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e83a86637d931638418a028ca5b217de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal\" (UID: \"e83a86637d931638418a028ca5b217de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" Apr 16 
13:58:55.744370 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.744245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e83a86637d931638418a028ca5b217de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal\" (UID: \"e83a86637d931638418a028ca5b217de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.744370 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.744299 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1aecfc38a711305ef53a73c57dbb3d6-config\") pod \"kube-apiserver-proxy-ip-10-0-140-59.ec2.internal\" (UID: \"b1aecfc38a711305ef53a73c57dbb3d6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.744370 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.744363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e83a86637d931638418a028ca5b217de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal\" (UID: \"e83a86637d931638418a028ca5b217de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.744477 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.744388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e83a86637d931638418a028ca5b217de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal\" (UID: \"e83a86637d931638418a028ca5b217de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.903084 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.902975 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal" Apr 16 13:58:55.904044 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:55.904025 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" Apr 16 13:58:56.240951 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.240861 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 13:58:56.241510 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.241043 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:58:56.241510 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.241077 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:58:56.241510 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.241089 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:58:56.315633 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.315587 2572 apiserver.go:52] "Watching apiserver" Apr 16 13:58:56.319802 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.319751 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:53:55 +0000 UTC" 
deadline="2027-12-22 00:24:13.855881859 +0000 UTC" Apr 16 13:58:56.319856 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.319804 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14746h25m17.536081743s" Apr 16 13:58:56.324659 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.324635 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 13:58:56.324989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.324967 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ppbcs","kube-system/konnectivity-agent-g9npf","openshift-dns/node-resolver-lnsdn","openshift-image-registry/node-ca-5p4kh","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal","openshift-network-operator/iptables-alerter-kpzwx","kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj","openshift-cluster-node-tuning-operator/tuned-5rc24","openshift-multus/multus-additional-cni-plugins-bb72s","openshift-multus/multus-xv4ws","openshift-multus/network-metrics-daemon-6bp8d","openshift-network-diagnostics/network-check-target-bct9b"] Apr 16 13:58:56.327838 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.327816 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.329949 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.329923 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-g9npf"
Apr 16 13:58:56.330104 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.329958 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 13:58:56.330104 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.330055 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 13:58:56.330104 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.330059 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 13:58:56.330303 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.330259 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 13:58:56.330411 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.330399 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 13:58:56.330735 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.330722 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 13:58:56.330802 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.330739 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jp2kq\""
Apr 16 13:58:56.331788 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.331769 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bpxb7\""
Apr 16 13:58:56.331907 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.331885 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 13:58:56.331990 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.331909 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 13:58:56.332233 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.332217 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lnsdn"
Apr 16 13:58:56.333904 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.333891 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-9rq89\""
Apr 16 13:58:56.334000 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.333985 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 13:58:56.334046 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.334003 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 13:58:56.334507 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.334495 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5p4kh"
Apr 16 13:58:56.336400 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.336386 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 13:58:56.336493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.336479 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 13:58:56.336684 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.336671 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 13:58:56.336731 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.336709 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kpzwx"
Apr 16 13:58:56.336845 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.336828 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6gpk8\""
Apr 16 13:58:56.338437 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.338425 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 13:58:56.338495 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.338469 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:58:56.338707 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.338695 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pwlhx\""
Apr 16 13:58:56.338749 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.338737 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 13:58:56.339005 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.338989 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.340894 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.340876 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 13:58:56.340976 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.340943 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gcj25\""
Apr 16 13:58:56.341026 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.340982 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 13:58:56.341061 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.341038 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 13:58:56.341221 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.341192 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.341386 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.341369 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:58:56.343163 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.343143 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:58:56.343266 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.343164 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-76pmw\""
Apr 16 13:58:56.343266 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.343208 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 13:58:56.343617 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.343602 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.346503 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.346651 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hwxl9\""
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.346666 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.346830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rvl9\" (UniqueName: \"kubernetes.io/projected/d63d695c-f063-4981-a993-07dd8b11f193-kube-api-access-5rvl9\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.346869 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-device-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.346928 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.346958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-node-log\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.346989 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347009 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40fcc100-8a15-40b9-a4d8-8c9913394f91-ovnkube-config\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347111 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e11c6740-8d55-4673-8d82-f90f6a93b413-host\") pod \"node-ca-5p4kh\" (UID: \"e11c6740-8d55-4673-8d82-f90f6a93b413\") " pod="openshift-image-registry/node-ca-5p4kh"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347171 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkqjv\" (UniqueName: \"kubernetes.io/projected/e11c6740-8d55-4673-8d82-f90f6a93b413-kube-api-access-hkqjv\") pod \"node-ca-5p4kh\" (UID: \"e11c6740-8d55-4673-8d82-f90f6a93b413\") " pod="openshift-image-registry/node-ca-5p4kh"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-system-cni-dir\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347260 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347271 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-run-netns\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347322 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8kxg\" (UniqueName: \"kubernetes.io/projected/9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d-kube-api-access-s8kxg\") pod \"node-resolver-lnsdn\" (UID: \"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d\") " pod="openshift-dns/node-resolver-lnsdn"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347360 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-lib-modules\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347394 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d63d695c-f063-4981-a993-07dd8b11f193-etc-tuned\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347422 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-slash\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.347493 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347447 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-etc-openvswitch\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347537 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-run-ovn-kubernetes\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d-hosts-file\") pod \"node-resolver-lnsdn\" (UID: \"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d\") " pod="openshift-dns/node-resolver-lnsdn"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347647 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-run\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-run-openvswitch\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347926 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-sysctl-conf\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347951 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-host\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.347982 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdq5g\" (UniqueName: \"kubernetes.io/projected/8725739e-0bdd-4d97-b43d-4551f43cd997-kube-api-access-tdq5g\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348030 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eee60be0-add8-410b-982e-1aa1f11ec111-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348074 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-kubelet\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348117 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e11c6740-8d55-4673-8d82-f90f6a93b413-serviceca\") pod \"node-ca-5p4kh\" (UID: \"e11c6740-8d55-4673-8d82-f90f6a93b413\") " pod="openshift-image-registry/node-ca-5p4kh"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01b92520-2c04-454e-8a4a-c542e9075e22-host-slash\") pod \"iptables-alerter-kpzwx\" (UID: \"01b92520-2c04-454e-8a4a-c542e9075e22\") " pod="openshift-network-operator/iptables-alerter-kpzwx"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348194 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-systemd-units\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348241 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d63d695c-f063-4981-a993-07dd8b11f193-tmp\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg2nb\" (UniqueName: \"kubernetes.io/projected/40fcc100-8a15-40b9-a4d8-8c9913394f91-kube-api-access-rg2nb\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348306 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-sys-fs\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.348469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-kubernetes\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.349103 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348354 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.349103 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348372 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/01b92520-2c04-454e-8a4a-c542e9075e22-iptables-alerter-script\") pod \"iptables-alerter-kpzwx\" (UID: \"01b92520-2c04-454e-8a4a-c542e9075e22\") " pod="openshift-network-operator/iptables-alerter-kpzwx"
Apr 16 13:58:56.349103 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-etc-selinux\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.349103 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-systemd\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.349103 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348647 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.349103 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.348959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eee60be0-add8-410b-982e-1aa1f11ec111-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.349103 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349072 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-cni-netd\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.349379 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349115 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40fcc100-8a15-40b9-a4d8-8c9913394f91-env-overrides\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.349379 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40fcc100-8a15-40b9-a4d8-8c9913394f91-ovn-node-metrics-cert\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.349645 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349626 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-os-release\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.349704 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-run-ovn\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.349755 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-registration-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.349755 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-sysconfig\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.349867 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349757 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eee60be0-add8-410b-982e-1aa1f11ec111-cni-binary-copy\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.349867 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-cni-bin\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.349867 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349809 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/59f23d5d-a915-4636-b347-7d14ae37dbed-konnectivity-ca\") pod \"konnectivity-agent-g9npf\" (UID: \"59f23d5d-a915-4636-b347-7d14ae37dbed\") " pod="kube-system/konnectivity-agent-g9npf"
Apr 16 13:58:56.349867 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349832 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-sysctl-d\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.349867 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349854 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-var-lib-kubelet\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.350071 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349879 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-modprobe-d\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.350071 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbgv4\" (UniqueName: \"kubernetes.io/projected/01b92520-2c04-454e-8a4a-c542e9075e22-kube-api-access-lbgv4\") pod \"iptables-alerter-kpzwx\" (UID: \"01b92520-2c04-454e-8a4a-c542e9075e22\") " pod="openshift-network-operator/iptables-alerter-kpzwx"
Apr 16 13:58:56.350071 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.349981 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-run-systemd\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.350071 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350030 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/59f23d5d-a915-4636-b347-7d14ae37dbed-agent-certs\") pod \"konnectivity-agent-g9npf\" (UID: \"59f23d5d-a915-4636-b347-7d14ae37dbed\") " pod="kube-system/konnectivity-agent-g9npf"
Apr 16 13:58:56.350071 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350058 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-var-lib-openvswitch\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.350305 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350081 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40fcc100-8a15-40b9-a4d8-8c9913394f91-ovnkube-script-lib\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.350305 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.350305 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350156 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-sys\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.350305 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350196 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh7s5\" (UniqueName: \"kubernetes.io/projected/eee60be0-add8-410b-982e-1aa1f11ec111-kube-api-access-lh7s5\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.350305 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-log-socket\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.350305 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350242 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d-tmp-dir\") pod \"node-resolver-lnsdn\" (UID: \"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d\") " pod="openshift-dns/node-resolver-lnsdn"
Apr 16 13:58:56.350305 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350266 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-socket-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.350305 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350293 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-cnibin\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.350594 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350387 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:58:56.350594 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350439 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jb5ss\""
Apr 16 13:58:56.350594 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:56.350441 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:58:56.350849 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.350835 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 13:58:56.352681 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.352666 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:58:56.352744 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:56.352712 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:58:56.356872 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.356857 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:58:56.375131 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.375113 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-z7kdq"
Apr 16 13:58:56.380287 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.380270 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-z7kdq"
Apr 16 13:58:56.443247 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.443227 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 13:58:56.450557 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdq5g\" (UniqueName: \"kubernetes.io/projected/8725739e-0bdd-4d97-b43d-4551f43cd997-kube-api-access-tdq5g\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.450643 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eee60be0-add8-410b-982e-1aa1f11ec111-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.450643 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-kubelet\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.450708 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e11c6740-8d55-4673-8d82-f90f6a93b413-serviceca\") pod \"node-ca-5p4kh\" (UID: \"e11c6740-8d55-4673-8d82-f90f6a93b413\") " pod="openshift-image-registry/node-ca-5p4kh" Apr 16 13:58:56.450708 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-cni-dir\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.450766 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01b92520-2c04-454e-8a4a-c542e9075e22-host-slash\") pod \"iptables-alerter-kpzwx\" (UID: \"01b92520-2c04-454e-8a4a-c542e9075e22\") " pod="openshift-network-operator/iptables-alerter-kpzwx" Apr 16 13:58:56.450766 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450741 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-run-netns\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.450766 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450756 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-bdlw8\" (UniqueName: \"kubernetes.io/projected/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-kube-api-access-bdlw8\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.450845 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-systemd-units\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.450876 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01b92520-2c04-454e-8a4a-c542e9075e22-host-slash\") pod \"iptables-alerter-kpzwx\" (UID: \"01b92520-2c04-454e-8a4a-c542e9075e22\") " pod="openshift-network-operator/iptables-alerter-kpzwx" Apr 16 13:58:56.450876 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450822 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-run-multus-certs\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.450934 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d63d695c-f063-4981-a993-07dd8b11f193-tmp\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.450970 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rg2nb\" (UniqueName: \"kubernetes.io/projected/40fcc100-8a15-40b9-a4d8-8c9913394f91-kube-api-access-rg2nb\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.451018 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.450954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-systemd-units\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.451079 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-sys-fs\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" Apr 16 13:58:56.451134 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451038 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-kubelet\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.451134 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-kubernetes\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.451243 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451163 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-kubernetes\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.451243 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-sys-fs\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" Apr 16 13:58:56.451243 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451201 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s" Apr 16 13:58:56.451380 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/01b92520-2c04-454e-8a4a-c542e9075e22-iptables-alerter-script\") pod \"iptables-alerter-kpzwx\" (UID: \"01b92520-2c04-454e-8a4a-c542e9075e22\") " pod="openshift-network-operator/iptables-alerter-kpzwx" Apr 16 13:58:56.451380 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-etc-selinux\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" Apr 16 13:58:56.451380 ip-10-0-140-59 
kubenswrapper[2572]: I0416 13:58:56.451355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-systemd\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.451544 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451411 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eee60be0-add8-410b-982e-1aa1f11ec111-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s" Apr 16 13:58:56.451544 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-cni-netd\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.451544 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451535 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40fcc100-8a15-40b9-a4d8-8c9913394f91-env-overrides\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.451694 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451542 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eee60be0-add8-410b-982e-1aa1f11ec111-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " 
pod="openshift-multus/multus-additional-cni-plugins-bb72s" Apr 16 13:58:56.451694 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451559 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40fcc100-8a15-40b9-a4d8-8c9913394f91-ovn-node-metrics-cert\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.451694 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451551 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:58:56.451694 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-etc-selinux\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" Apr 16 13:58:56.451694 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451678 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e11c6740-8d55-4673-8d82-f90f6a93b413-serviceca\") pod \"node-ca-5p4kh\" (UID: \"e11c6740-8d55-4673-8d82-f90f6a93b413\") " pod="openshift-image-registry/node-ca-5p4kh" Apr 16 13:58:56.451927 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451683 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-cni-netd\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.451927 ip-10-0-140-59 kubenswrapper[2572]: 
I0416 13:58:56.451730 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-systemd\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.451927 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-system-cni-dir\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.451927 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-os-release\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s" Apr 16 13:58:56.452091 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451966 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-run-ovn\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.452091 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.451941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s" Apr 16 13:58:56.452091 ip-10-0-140-59 
kubenswrapper[2572]: I0416 13:58:56.452019 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rpkz\" (UniqueName: \"kubernetes.io/projected/91ffb15b-8d84-4a65-a157-65c7adaca0ea-kube-api-access-5rpkz\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:58:56.452091 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452059 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-os-release\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.452091 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452086 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-var-lib-cni-multus\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.452268 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452111 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-os-release\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s" Apr 16 13:58:56.452268 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-registration-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" Apr 16 13:58:56.452268 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452205 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-registration-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" Apr 16 13:58:56.452268 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eee60be0-add8-410b-982e-1aa1f11ec111-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s" Apr 16 13:58:56.452268 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452226 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-run-ovn\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.452268 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452207 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-sysconfig\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.452268 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452242 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/01b92520-2c04-454e-8a4a-c542e9075e22-iptables-alerter-script\") pod \"iptables-alerter-kpzwx\" (UID: \"01b92520-2c04-454e-8a4a-c542e9075e22\") " pod="openshift-network-operator/iptables-alerter-kpzwx" Apr 16 13:58:56.452268 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-sysconfig\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eee60be0-add8-410b-982e-1aa1f11ec111-cni-binary-copy\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-cni-bin\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/59f23d5d-a915-4636-b347-7d14ae37dbed-konnectivity-ca\") pod \"konnectivity-agent-g9npf\" (UID: \"59f23d5d-a915-4636-b347-7d14ae37dbed\") " pod="kube-system/konnectivity-agent-g9npf" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40fcc100-8a15-40b9-a4d8-8c9913394f91-env-overrides\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452374 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-cni-bin\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-conf-dir\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-sysctl-d\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-var-lib-kubelet\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-modprobe-d\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452533 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-var-lib-kubelet\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452557 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbgv4\" (UniqueName: \"kubernetes.io/projected/01b92520-2c04-454e-8a4a-c542e9075e22-kube-api-access-lbgv4\") pod \"iptables-alerter-kpzwx\" (UID: \"01b92520-2c04-454e-8a4a-c542e9075e22\") " pod="openshift-network-operator/iptables-alerter-kpzwx" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-run-systemd\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-sysctl-d\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24" Apr 16 13:58:56.452629 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452604 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/59f23d5d-a915-4636-b347-7d14ae37dbed-agent-certs\") pod \"konnectivity-agent-g9npf\" (UID: \"59f23d5d-a915-4636-b347-7d14ae37dbed\") " pod="kube-system/konnectivity-agent-g9npf" Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452641 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-daemon-config\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-var-lib-openvswitch\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-run-systemd\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40fcc100-8a15-40b9-a4d8-8c9913394f91-ovnkube-script-lib\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452729 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-var-lib-kubelet\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-sys\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lh7s5\" (UniqueName: \"kubernetes.io/projected/eee60be0-add8-410b-982e-1aa1f11ec111-kube-api-access-lh7s5\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452865 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/59f23d5d-a915-4636-b347-7d14ae37dbed-konnectivity-ca\") pod \"konnectivity-agent-g9npf\" (UID: \"59f23d5d-a915-4636-b347-7d14ae37dbed\") " pod="kube-system/konnectivity-agent-g9npf"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452851 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-log-socket\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d-tmp-dir\") pod \"node-resolver-lnsdn\" (UID: \"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d\") " pod="openshift-dns/node-resolver-lnsdn"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452946 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eee60be0-add8-410b-982e-1aa1f11ec111-cni-binary-copy\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452975 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-cnibin\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452914 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-log-socket\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453024 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-sys\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.453258 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.452642 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-modprobe-d\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453143 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-var-lib-openvswitch\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-run-k8s-cni-cncf-io\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-socket-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-cnibin\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7shg7\" (UniqueName: \"kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7\") pod \"network-check-target-bct9b\" (UID: \"fb8eecda-88c7-4d10-97ed-5f758d438dc2\") " pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453307 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40fcc100-8a15-40b9-a4d8-8c9913394f91-ovnkube-script-lib\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453334 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-socket-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d-tmp-dir\") pod \"node-resolver-lnsdn\" (UID: \"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d\") " pod="openshift-dns/node-resolver-lnsdn"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-cnibin\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rvl9\" (UniqueName: \"kubernetes.io/projected/d63d695c-f063-4981-a993-07dd8b11f193-kube-api-access-5rvl9\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-device-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8725739e-0bdd-4d97-b43d-4551f43cd997-device-dir\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-node-log\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453513 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-node-log\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40fcc100-8a15-40b9-a4d8-8c9913394f91-ovnkube-config\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454020 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453589 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453590 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e11c6740-8d55-4673-8d82-f90f6a93b413-host\") pod \"node-ca-5p4kh\" (UID: \"e11c6740-8d55-4673-8d82-f90f6a93b413\") " pod="openshift-image-registry/node-ca-5p4kh"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkqjv\" (UniqueName: \"kubernetes.io/projected/e11c6740-8d55-4673-8d82-f90f6a93b413-kube-api-access-hkqjv\") pod \"node-ca-5p4kh\" (UID: \"e11c6740-8d55-4673-8d82-f90f6a93b413\") " pod="openshift-image-registry/node-ca-5p4kh"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453738 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e11c6740-8d55-4673-8d82-f90f6a93b413-host\") pod \"node-ca-5p4kh\" (UID: \"e11c6740-8d55-4673-8d82-f90f6a93b413\") " pod="openshift-image-registry/node-ca-5p4kh"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-etc-kubernetes\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-system-cni-dir\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-run-netns\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453828 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8kxg\" (UniqueName: \"kubernetes.io/projected/9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d-kube-api-access-s8kxg\") pod \"node-resolver-lnsdn\" (UID: \"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d\") " pod="openshift-dns/node-resolver-lnsdn"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-lib-modules\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453884 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d63d695c-f063-4981-a993-07dd8b11f193-etc-tuned\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453910 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-slash\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453913 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-run-netns\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.453949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-etc-openvswitch\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40fcc100-8a15-40b9-a4d8-8c9913394f91-ovnkube-config\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454087 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-slash\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-run-ovn-kubernetes\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eee60be0-add8-410b-982e-1aa1f11ec111-system-cni-dir\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.454834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454420 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-host-run-ovn-kubernetes\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454542 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-etc-openvswitch\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454677 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-lib-modules\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454760 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d-hosts-file\") pod \"node-resolver-lnsdn\" (UID: \"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d\") " pod="openshift-dns/node-resolver-lnsdn"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454793 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-run\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454937 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-run-openvswitch\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.454995 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d-hosts-file\") pod \"node-resolver-lnsdn\" (UID: \"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d\") " pod="openshift-dns/node-resolver-lnsdn"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-run\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455142 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40fcc100-8a15-40b9-a4d8-8c9913394f91-run-openvswitch\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455154 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d63d695c-f063-4981-a993-07dd8b11f193-tmp\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-cni-binary-copy\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455316 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40fcc100-8a15-40b9-a4d8-8c9913394f91-ovn-node-metrics-cert\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455282 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-socket-dir-parent\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-var-lib-cni-bin\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455437 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-hostroot\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-sysctl-conf\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.455610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455563 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-host\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.456372 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-etc-sysctl-conf\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.456372 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.455810 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63d695c-f063-4981-a993-07dd8b11f193-host\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.456372 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.456045 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/59f23d5d-a915-4636-b347-7d14ae37dbed-agent-certs\") pod \"konnectivity-agent-g9npf\" (UID: \"59f23d5d-a915-4636-b347-7d14ae37dbed\") " pod="kube-system/konnectivity-agent-g9npf"
Apr 16 13:58:56.456372 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.456363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d63d695c-f063-4981-a993-07dd8b11f193-etc-tuned\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.458748 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.458723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdq5g\" (UniqueName: \"kubernetes.io/projected/8725739e-0bdd-4d97-b43d-4551f43cd997-kube-api-access-tdq5g\") pod \"aws-ebs-csi-driver-node-5cqlj\" (UID: \"8725739e-0bdd-4d97-b43d-4551f43cd997\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj"
Apr 16 13:58:56.459317 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.459296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg2nb\" (UniqueName: \"kubernetes.io/projected/40fcc100-8a15-40b9-a4d8-8c9913394f91-kube-api-access-rg2nb\") pod \"ovnkube-node-ppbcs\" (UID: \"40fcc100-8a15-40b9-a4d8-8c9913394f91\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs"
Apr 16 13:58:56.459565 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.459543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbgv4\" (UniqueName: \"kubernetes.io/projected/01b92520-2c04-454e-8a4a-c542e9075e22-kube-api-access-lbgv4\") pod \"iptables-alerter-kpzwx\" (UID: \"01b92520-2c04-454e-8a4a-c542e9075e22\") " pod="openshift-network-operator/iptables-alerter-kpzwx"
Apr 16 13:58:56.460930 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.460906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh7s5\" (UniqueName: \"kubernetes.io/projected/eee60be0-add8-410b-982e-1aa1f11ec111-kube-api-access-lh7s5\") pod \"multus-additional-cni-plugins-bb72s\" (UID: \"eee60be0-add8-410b-982e-1aa1f11ec111\") " pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.461282 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.461252 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkqjv\" (UniqueName: \"kubernetes.io/projected/e11c6740-8d55-4673-8d82-f90f6a93b413-kube-api-access-hkqjv\") pod \"node-ca-5p4kh\" (UID: \"e11c6740-8d55-4673-8d82-f90f6a93b413\") " pod="openshift-image-registry/node-ca-5p4kh"
Apr 16 13:58:56.461423 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.461390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rvl9\" (UniqueName: \"kubernetes.io/projected/d63d695c-f063-4981-a993-07dd8b11f193-kube-api-access-5rvl9\") pod \"tuned-5rc24\" (UID: \"d63d695c-f063-4981-a993-07dd8b11f193\") " pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.461611 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.461595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8kxg\" (UniqueName: \"kubernetes.io/projected/9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d-kube-api-access-s8kxg\") pod \"node-resolver-lnsdn\" (UID: \"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d\") " pod="openshift-dns/node-resolver-lnsdn"
Apr 16 13:58:56.472811 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.472795 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5rc24"
Apr 16 13:58:56.479242 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.479226 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bb72s"
Apr 16 13:58:56.556125 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-cni-dir\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556125 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-run-netns\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556125 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdlw8\" (UniqueName: \"kubernetes.io/projected/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-kube-api-access-bdlw8\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-run-multus-certs\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-cni-dir\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-run-netns\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-system-cni-dir\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556187 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-run-multus-certs\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rpkz\" (UniqueName: \"kubernetes.io/projected/91ffb15b-8d84-4a65-a157-65c7adaca0ea-kube-api-access-5rpkz\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556220 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-system-cni-dir\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-os-release\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-var-lib-cni-multus\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-conf-dir\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-daemon-config\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-var-lib-kubelet\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556336 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-var-lib-cni-multus\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556367 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556364 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-conf-dir\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-var-lib-kubelet\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-os-release\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-cnibin\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556521 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-run-k8s-cni-cncf-io\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7shg7\" (UniqueName: \"kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7\") pod \"network-check-target-bct9b\" (UID: \"fb8eecda-88c7-4d10-97ed-5f758d438dc2\") " pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-cnibin\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-etc-kubernetes\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-etc-kubernetes\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556664 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-cni-binary-copy\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-run-k8s-cni-cncf-io\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws"
Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName:
\"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-socket-dir-parent\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-var-lib-cni-bin\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-hostroot\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:56.556763 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-host-var-lib-cni-bin\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556806 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-hostroot\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.556989 ip-10-0-140-59 kubenswrapper[2572]: 
I0416 13:58:56.556758 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-socket-dir-parent\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.557517 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:56.556841 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs podName:91ffb15b-8d84-4a65-a157-65c7adaca0ea nodeName:}" failed. No retries permitted until 2026-04-16 13:58:57.056806711 +0000 UTC m=+2.213141749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs") pod "network-metrics-daemon-6bp8d" (UID: "91ffb15b-8d84-4a65-a157-65c7adaca0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:56.557517 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.556927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-multus-daemon-config\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.557517 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.557062 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-cni-binary-copy\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.563601 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:56.563585 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:58:56.563651 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:56.563606 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:58:56.563651 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:56.563619 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7shg7 for pod openshift-network-diagnostics/network-check-target-bct9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:56.563719 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:56.563681 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7 podName:fb8eecda-88c7-4d10-97ed-5f758d438dc2 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:57.063665189 +0000 UTC m=+2.220000212 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7shg7" (UniqueName: "kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7") pod "network-check-target-bct9b" (UID: "fb8eecda-88c7-4d10-97ed-5f758d438dc2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:56.565010 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.564983 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rpkz\" (UniqueName: \"kubernetes.io/projected/91ffb15b-8d84-4a65-a157-65c7adaca0ea-kube-api-access-5rpkz\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:58:56.565720 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.565701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdlw8\" (UniqueName: \"kubernetes.io/projected/ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7-kube-api-access-bdlw8\") pod \"multus-xv4ws\" (UID: \"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7\") " pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.665386 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.665352 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:58:56.676593 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.676560 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g9npf" Apr 16 13:58:56.697222 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.697186 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lnsdn" Apr 16 13:58:56.708941 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.708903 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5p4kh" Apr 16 13:58:56.728631 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.728593 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kpzwx" Apr 16 13:58:56.742394 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.742354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" Apr 16 13:58:56.783599 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.783566 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xv4ws" Apr 16 13:58:56.822607 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:56.822577 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode83a86637d931638418a028ca5b217de.slice/crio-f8dce16abe4290ac2bf76b923ccf7683374609fd7d3e0326a87a4a34236987e8 WatchSource:0}: Error finding container f8dce16abe4290ac2bf76b923ccf7683374609fd7d3e0326a87a4a34236987e8: Status 404 returned error can't find the container with id f8dce16abe4290ac2bf76b923ccf7683374609fd7d3e0326a87a4a34236987e8 Apr 16 13:58:56.822859 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:56.822837 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccd40d4c_3ebd_4f04_b7c0_a807865cb3c7.slice/crio-8e20ef72452461c2c614ab050219efbf0089a0cddcf7f8fd2f3eba5baf164e29 WatchSource:0}: Error finding container 8e20ef72452461c2c614ab050219efbf0089a0cddcf7f8fd2f3eba5baf164e29: Status 404 returned error can't find the container with id 8e20ef72452461c2c614ab050219efbf0089a0cddcf7f8fd2f3eba5baf164e29 Apr 16 13:58:56.827075 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:56.827059 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 
13:58:56.846960 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:56.846935 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeee60be0_add8_410b_982e_1aa1f11ec111.slice/crio-d8b778e17367409047402c953afa007dc2c35f0c9c9c1505a68e590cdd035394 WatchSource:0}: Error finding container d8b778e17367409047402c953afa007dc2c35f0c9c9c1505a68e590cdd035394: Status 404 returned error can't find the container with id d8b778e17367409047402c953afa007dc2c35f0c9c9c1505a68e590cdd035394 Apr 16 13:58:56.956382 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:56.956356 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd63d695c_f063_4981_a993_07dd8b11f193.slice/crio-4e928559bae4e3039433b46d52a3edbd4e18a3d553bb6e26e4b45a71ec0688b3 WatchSource:0}: Error finding container 4e928559bae4e3039433b46d52a3edbd4e18a3d553bb6e26e4b45a71ec0688b3: Status 404 returned error can't find the container with id 4e928559bae4e3039433b46d52a3edbd4e18a3d553bb6e26e4b45a71ec0688b3 Apr 16 13:58:57.060158 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.060126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:58:57.060300 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:57.060224 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:57.060300 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:57.060270 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs 
podName:91ffb15b-8d84-4a65-a157-65c7adaca0ea nodeName:}" failed. No retries permitted until 2026-04-16 13:58:58.060257286 +0000 UTC m=+3.216592304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs") pod "network-metrics-daemon-6bp8d" (UID: "91ffb15b-8d84-4a65-a157-65c7adaca0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:57.161183 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.161086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7shg7\" (UniqueName: \"kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7\") pod \"network-check-target-bct9b\" (UID: \"fb8eecda-88c7-4d10-97ed-5f758d438dc2\") " pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:58:57.161379 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:57.161284 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:58:57.161379 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:57.161308 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:58:57.161379 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:57.161319 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7shg7 for pod openshift-network-diagnostics/network-check-target-bct9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:57.161379 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:57.161380 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7 podName:fb8eecda-88c7-4d10-97ed-5f758d438dc2 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:58.161360456 +0000 UTC m=+3.317695492 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7shg7" (UniqueName: "kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7") pod "network-check-target-bct9b" (UID: "fb8eecda-88c7-4d10-97ed-5f758d438dc2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:57.165272 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.165253 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:57.200949 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:57.200920 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59f23d5d_a915_4636_b347_7d14ae37dbed.slice/crio-41a6852e56ce25f27dce464ca6dbaf1b7974437a21e3e4f57475b2e712a33952 WatchSource:0}: Error finding container 41a6852e56ce25f27dce464ca6dbaf1b7974437a21e3e4f57475b2e712a33952: Status 404 returned error can't find the container with id 41a6852e56ce25f27dce464ca6dbaf1b7974437a21e3e4f57475b2e712a33952 Apr 16 13:58:57.264976 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:57.264952 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01b92520_2c04_454e_8a4a_c542e9075e22.slice/crio-0a7c17801160df36b98854b3ec8797e4e474c2502e79159a67e2e9d5cded619f WatchSource:0}: Error finding container 0a7c17801160df36b98854b3ec8797e4e474c2502e79159a67e2e9d5cded619f: Status 404 returned error can't find the container with id 0a7c17801160df36b98854b3ec8797e4e474c2502e79159a67e2e9d5cded619f Apr 16 
13:58:57.292186 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.292165 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:57.298843 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:57.298819 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b706a4d_2ea5_4651_bfda_d3c5cdc3fe5d.slice/crio-ca6166de3095330a50afd07462f15fd1cc82055a3bb8afef804e30671d52c35a WatchSource:0}: Error finding container ca6166de3095330a50afd07462f15fd1cc82055a3bb8afef804e30671d52c35a: Status 404 returned error can't find the container with id ca6166de3095330a50afd07462f15fd1cc82055a3bb8afef804e30671d52c35a Apr 16 13:58:57.308306 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:57.308279 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode11c6740_8d55_4673_8d82_f90f6a93b413.slice/crio-0b9e3cfffb2a6d6fe02dba97dcf07540291841e5ccb0891cf7f1e979bfd6f6c4 WatchSource:0}: Error finding container 0b9e3cfffb2a6d6fe02dba97dcf07540291841e5ccb0891cf7f1e979bfd6f6c4: Status 404 returned error can't find the container with id 0b9e3cfffb2a6d6fe02dba97dcf07540291841e5ccb0891cf7f1e979bfd6f6c4 Apr 16 13:58:57.361283 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.361261 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:57.374902 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:57.374864 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8725739e_0bdd_4d97_b43d_4551f43cd997.slice/crio-2f4a37825bbd63015d03ad0fab9979f4feae61b159146b621bf20f64799a4930 WatchSource:0}: Error finding container 2f4a37825bbd63015d03ad0fab9979f4feae61b159146b621bf20f64799a4930: Status 404 returned error can't find the container with id 
2f4a37825bbd63015d03ad0fab9979f4feae61b159146b621bf20f64799a4930 Apr 16 13:58:57.382065 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.382036 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:53:56 +0000 UTC" deadline="2027-09-27 04:03:08.857145253 +0000 UTC" Apr 16 13:58:57.382152 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.382067 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12686h4m11.475081839s" Apr 16 13:58:57.429601 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:57.429570 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1aecfc38a711305ef53a73c57dbb3d6.slice/crio-ae536aa014974e041f1cf7b6b5a321cf9cd67a7fbce8f8d41976b72db76173a8 WatchSource:0}: Error finding container ae536aa014974e041f1cf7b6b5a321cf9cd67a7fbce8f8d41976b72db76173a8: Status 404 returned error can't find the container with id ae536aa014974e041f1cf7b6b5a321cf9cd67a7fbce8f8d41976b72db76173a8 Apr 16 13:58:57.463373 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.463341 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:58:57.463541 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:57.463485 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2" Apr 16 13:58:57.470422 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.470364 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kpzwx" event={"ID":"01b92520-2c04-454e-8a4a-c542e9075e22","Type":"ContainerStarted","Data":"0a7c17801160df36b98854b3ec8797e4e474c2502e79159a67e2e9d5cded619f"} Apr 16 13:58:57.471474 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.471435 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5rc24" event={"ID":"d63d695c-f063-4981-a993-07dd8b11f193","Type":"ContainerStarted","Data":"4e928559bae4e3039433b46d52a3edbd4e18a3d553bb6e26e4b45a71ec0688b3"} Apr 16 13:58:57.473682 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.473656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb72s" event={"ID":"eee60be0-add8-410b-982e-1aa1f11ec111","Type":"ContainerStarted","Data":"d8b778e17367409047402c953afa007dc2c35f0c9c9c1505a68e590cdd035394"} Apr 16 13:58:57.475061 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.475031 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv4ws" event={"ID":"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7","Type":"ContainerStarted","Data":"8e20ef72452461c2c614ab050219efbf0089a0cddcf7f8fd2f3eba5baf164e29"} Apr 16 13:58:57.476330 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.476307 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal" event={"ID":"b1aecfc38a711305ef53a73c57dbb3d6","Type":"ContainerStarted","Data":"ae536aa014974e041f1cf7b6b5a321cf9cd67a7fbce8f8d41976b72db76173a8"} Apr 16 13:58:57.477478 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.477442 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" event={"ID":"8725739e-0bdd-4d97-b43d-4551f43cd997","Type":"ContainerStarted","Data":"2f4a37825bbd63015d03ad0fab9979f4feae61b159146b621bf20f64799a4930"} Apr 16 13:58:57.478565 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.478524 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5p4kh" event={"ID":"e11c6740-8d55-4673-8d82-f90f6a93b413","Type":"ContainerStarted","Data":"0b9e3cfffb2a6d6fe02dba97dcf07540291841e5ccb0891cf7f1e979bfd6f6c4"} Apr 16 13:58:57.479793 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.479768 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lnsdn" event={"ID":"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d","Type":"ContainerStarted","Data":"ca6166de3095330a50afd07462f15fd1cc82055a3bb8afef804e30671d52c35a"} Apr 16 13:58:57.481049 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.481025 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g9npf" event={"ID":"59f23d5d-a915-4636-b347-7d14ae37dbed","Type":"ContainerStarted","Data":"41a6852e56ce25f27dce464ca6dbaf1b7974437a21e3e4f57475b2e712a33952"} Apr 16 13:58:57.482371 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:57.482349 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" event={"ID":"e83a86637d931638418a028ca5b217de","Type":"ContainerStarted","Data":"f8dce16abe4290ac2bf76b923ccf7683374609fd7d3e0326a87a4a34236987e8"} Apr 16 13:58:57.857076 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:58:57.857039 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40fcc100_8a15_40b9_a4d8_8c9913394f91.slice/crio-880ea21bf0806c48eda15845867dc37909c6d17b1d0bc7d0ab7136755d1a3f03 WatchSource:0}: Error finding container 
880ea21bf0806c48eda15845867dc37909c6d17b1d0bc7d0ab7136755d1a3f03: Status 404 returned error can't find the container with id 880ea21bf0806c48eda15845867dc37909c6d17b1d0bc7d0ab7136755d1a3f03 Apr 16 13:58:58.068364 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:58.068327 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:58:58.068550 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:58.068486 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:58.068675 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:58.068614 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs podName:91ffb15b-8d84-4a65-a157-65c7adaca0ea nodeName:}" failed. No retries permitted until 2026-04-16 13:59:00.06859477 +0000 UTC m=+5.224929809 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs") pod "network-metrics-daemon-6bp8d" (UID: "91ffb15b-8d84-4a65-a157-65c7adaca0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:58.169749 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:58.169663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7shg7\" (UniqueName: \"kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7\") pod \"network-check-target-bct9b\" (UID: \"fb8eecda-88c7-4d10-97ed-5f758d438dc2\") " pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:58:58.169938 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:58.169851 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:58:58.169938 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:58.169880 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:58:58.169938 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:58.169902 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7shg7 for pod openshift-network-diagnostics/network-check-target-bct9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:58.170129 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:58.169965 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7 podName:fb8eecda-88c7-4d10-97ed-5f758d438dc2 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:00.169944723 +0000 UTC m=+5.326279745 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7shg7" (UniqueName: "kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7") pod "network-check-target-bct9b" (UID: "fb8eecda-88c7-4d10-97ed-5f758d438dc2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:58:58.383147 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:58.383104 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:53:56 +0000 UTC" deadline="2027-12-20 01:38:34.440078918 +0000 UTC"
Apr 16 13:58:58.383147 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:58.383146 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14699h39m36.056937163s"
Apr 16 13:58:58.463808 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:58.463727 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:58:58.463989 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:58.463867 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:58:58.498671 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:58.498634 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" event={"ID":"40fcc100-8a15-40b9-a4d8-8c9913394f91","Type":"ContainerStarted","Data":"880ea21bf0806c48eda15845867dc37909c6d17b1d0bc7d0ab7136755d1a3f03"}
Apr 16 13:58:59.467489 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:58:59.467447 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:58:59.467939 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:58:59.467574 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:59:00.083274 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:00.083226 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:00.083484 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:00.083424 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:00.083572 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:00.083507 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs podName:91ffb15b-8d84-4a65-a157-65c7adaca0ea nodeName:}" failed. No retries permitted until 2026-04-16 13:59:04.083488279 +0000 UTC m=+9.239823300 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs") pod "network-metrics-daemon-6bp8d" (UID: "91ffb15b-8d84-4a65-a157-65c7adaca0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:00.183884 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:00.183839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7shg7\" (UniqueName: \"kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7\") pod \"network-check-target-bct9b\" (UID: \"fb8eecda-88c7-4d10-97ed-5f758d438dc2\") " pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:00.184082 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:00.184017 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:00.184082 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:00.184042 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:00.184082 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:00.184056 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7shg7 for pod openshift-network-diagnostics/network-check-target-bct9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:00.184249 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:00.184115 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7 podName:fb8eecda-88c7-4d10-97ed-5f758d438dc2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:04.184095118 +0000 UTC m=+9.340430152 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7shg7" (UniqueName: "kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7") pod "network-check-target-bct9b" (UID: "fb8eecda-88c7-4d10-97ed-5f758d438dc2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:00.463492 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:00.463405 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:00.463654 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:00.463553 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:59:01.469179 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:01.469145 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:01.469675 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:01.469283 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:59:02.463428 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:02.463390 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:02.463613 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:02.463558 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:59:03.466755 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:03.466683 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:03.467197 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:03.466814 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:59:04.120032 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:04.119641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:04.120032 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:04.119830 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:04.120032 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:04.119897 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs podName:91ffb15b-8d84-4a65-a157-65c7adaca0ea nodeName:}" failed. No retries permitted until 2026-04-16 13:59:12.119878683 +0000 UTC m=+17.276213708 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs") pod "network-metrics-daemon-6bp8d" (UID: "91ffb15b-8d84-4a65-a157-65c7adaca0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:04.221252 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:04.220849 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7shg7\" (UniqueName: \"kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7\") pod \"network-check-target-bct9b\" (UID: \"fb8eecda-88c7-4d10-97ed-5f758d438dc2\") " pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:04.221252 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:04.221033 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:04.221252 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:04.221054 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:04.221252 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:04.221067 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7shg7 for pod openshift-network-diagnostics/network-check-target-bct9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:04.221252 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:04.221123 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7 podName:fb8eecda-88c7-4d10-97ed-5f758d438dc2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:12.221105949 +0000 UTC m=+17.377440970 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7shg7" (UniqueName: "kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7") pod "network-check-target-bct9b" (UID: "fb8eecda-88c7-4d10-97ed-5f758d438dc2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:04.463429 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:04.463353 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:04.463614 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:04.463516 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:59:05.464107 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:05.464020 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:05.464600 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:05.464138 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:59:06.463928 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:06.463635 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:06.464098 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:06.464015 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:59:07.466655 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:07.466608 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:07.467057 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:07.466723 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:59:08.463443 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:08.463398 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:08.463637 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:08.463548 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:59:09.465611 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:09.465583 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:09.465964 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:09.465689 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:59:10.463725 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:10.463691 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:10.463891 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:10.463804 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:59:11.463419 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:11.463328 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:11.463862 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:11.463485 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:59:12.182254 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:12.182211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:12.182478 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:12.182357 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:12.182478 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:12.182421 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs podName:91ffb15b-8d84-4a65-a157-65c7adaca0ea nodeName:}" failed. No retries permitted until 2026-04-16 13:59:28.182406837 +0000 UTC m=+33.338741873 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs") pod "network-metrics-daemon-6bp8d" (UID: "91ffb15b-8d84-4a65-a157-65c7adaca0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:12.283263 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:12.283218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7shg7\" (UniqueName: \"kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7\") pod \"network-check-target-bct9b\" (UID: \"fb8eecda-88c7-4d10-97ed-5f758d438dc2\") " pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:12.283499 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:12.283404 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:12.283499 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:12.283429 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:12.283499 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:12.283444 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7shg7 for pod openshift-network-diagnostics/network-check-target-bct9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:12.283654 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:12.283528 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7 podName:fb8eecda-88c7-4d10-97ed-5f758d438dc2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:28.283509411 +0000 UTC m=+33.439844447 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7shg7" (UniqueName: "kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7") pod "network-check-target-bct9b" (UID: "fb8eecda-88c7-4d10-97ed-5f758d438dc2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:12.463707 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:12.463625 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:12.464139 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:12.463740 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:59:13.466562 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:13.466534 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:13.466989 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:13.466656 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:59:14.463154 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:14.463119 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:14.463346 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:14.463238 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:59:15.464415 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:15.464196 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:15.464415 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:15.464316 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:59:15.535369 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:15.535332 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5rc24" event={"ID":"d63d695c-f063-4981-a993-07dd8b11f193","Type":"ContainerStarted","Data":"1f87981db8cf366e08e8d01d20c6fb0b3dbb5bc3ee4e5102ed38dd9c2df1f9b7"}
Apr 16 13:59:15.537355 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:15.537328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv4ws" event={"ID":"ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7","Type":"ContainerStarted","Data":"9bd7d98ff0f0981977d814f7cf47381c8114a290e93e8d193468513479c0c28a"}
Apr 16 13:59:15.540396 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:15.540373 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal" event={"ID":"b1aecfc38a711305ef53a73c57dbb3d6","Type":"ContainerStarted","Data":"a519104b313b11719d569f668af684e0e7637633143ff9a295d2d85b15422f4e"}
Apr 16 13:59:15.545902 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:15.545871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" event={"ID":"40fcc100-8a15-40b9-a4d8-8c9913394f91","Type":"ContainerStarted","Data":"9b58247d75d47ced0c1896adc1d57fb7a62e98f2fbf5ca378a851b8a1ebdf066"}
Apr 16 13:59:15.546012 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:15.545909 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" event={"ID":"40fcc100-8a15-40b9-a4d8-8c9913394f91","Type":"ContainerStarted","Data":"aeced0f6455cd7f63b44465fd5a69a0b5a16f4aae80dfe45f6e84385e70b8503"}
Apr 16 13:59:15.553786 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:15.552371 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5rc24" podStartSLOduration=2.272393705 podStartE2EDuration="20.552354541s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="2026-04-16 13:58:56.957965025 +0000 UTC m=+2.114300043" lastFinishedPulling="2026-04-16 13:59:15.237925858 +0000 UTC m=+20.394260879" observedRunningTime="2026-04-16 13:59:15.551223659 +0000 UTC m=+20.707558701" watchObservedRunningTime="2026-04-16 13:59:15.552354541 +0000 UTC m=+20.708689583"
Apr 16 13:59:15.564938 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:15.564424 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-59.ec2.internal" podStartSLOduration=20.564407477 podStartE2EDuration="20.564407477s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:15.563753247 +0000 UTC m=+20.720088288" watchObservedRunningTime="2026-04-16 13:59:15.564407477 +0000 UTC m=+20.720742519"
Apr 16 13:59:16.463880 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.463849 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 13:59:16.464053 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:16.464029 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea"
Apr 16 13:59:16.549036 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.549002 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" event={"ID":"8725739e-0bdd-4d97-b43d-4551f43cd997","Type":"ContainerStarted","Data":"c9cbbdfa1c1e98d3f64bfaeff412d1e65c904a7a106b4ea90df7eb16e2e24ace"}
Apr 16 13:59:16.550389 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.550360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5p4kh" event={"ID":"e11c6740-8d55-4673-8d82-f90f6a93b413","Type":"ContainerStarted","Data":"6a16966f44de80ddbfd3da2a16cba5f7eb24a585ca334780f59b6004011bd83f"}
Apr 16 13:59:16.551869 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.551842 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lnsdn" event={"ID":"9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d","Type":"ContainerStarted","Data":"b03bc533b52651c45be461c7be9be405425d74d245b0ed08e369cd52e07a364a"}
Apr 16 13:59:16.553249 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.553232 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g9npf" event={"ID":"59f23d5d-a915-4636-b347-7d14ae37dbed","Type":"ContainerStarted","Data":"04fb3eece2a00c1838dcd98bcf2a114220a95373e15165467e9be2571e6ddb9f"}
Apr 16 13:59:16.554614 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.554593 2572 generic.go:358] "Generic (PLEG): container finished" podID="e83a86637d931638418a028ca5b217de" containerID="110aeef2cdb95e2fa9305f2dee2b24c97793b874190e710711ec027ab6288eb0" exitCode=0
Apr 16 13:59:16.554709 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.554676 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" event={"ID":"e83a86637d931638418a028ca5b217de","Type":"ContainerDied","Data":"110aeef2cdb95e2fa9305f2dee2b24c97793b874190e710711ec027ab6288eb0"}
Apr 16 13:59:16.557689 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.557667 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" event={"ID":"40fcc100-8a15-40b9-a4d8-8c9913394f91","Type":"ContainerStarted","Data":"cacd5edd2299958e8f3dee2b9a04ee3a6c50be5f725719f07be557e65f2e5930"}
Apr 16 13:59:16.557784 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.557695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" event={"ID":"40fcc100-8a15-40b9-a4d8-8c9913394f91","Type":"ContainerStarted","Data":"3da2d7909e84f58aad11cbb09d2d43073b5e55d78e44be1056445d77fb8cdd35"}
Apr 16 13:59:16.557784 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.557719 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" event={"ID":"40fcc100-8a15-40b9-a4d8-8c9913394f91","Type":"ContainerStarted","Data":"668f23c7b4a03b385923f8112d54c7480bdde6de9abb0ed0685e41021a00d67b"}
Apr 16 13:59:16.557784 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.557728 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" event={"ID":"40fcc100-8a15-40b9-a4d8-8c9913394f91","Type":"ContainerStarted","Data":"591c459360921b81b46376663b1e62c3351d272a398c33007ebde3cea2e8c0fd"}
Apr 16 13:59:16.558979 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.558959 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kpzwx" event={"ID":"01b92520-2c04-454e-8a4a-c542e9075e22","Type":"ContainerStarted","Data":"b54f587d4224ffeb3f20295de3c0db9c2aa3cc15325169197f72175f8d64619b"}
Apr 16 13:59:16.560399 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.560375 2572 generic.go:358] "Generic (PLEG): container finished" podID="eee60be0-add8-410b-982e-1aa1f11ec111" containerID="069312bfa85262b6fbb5c2424ddfacec872fde786355e38161ceabd81ccb73e4" exitCode=0
Apr 16 13:59:16.560513 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.560483 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb72s" event={"ID":"eee60be0-add8-410b-982e-1aa1f11ec111","Type":"ContainerDied","Data":"069312bfa85262b6fbb5c2424ddfacec872fde786355e38161ceabd81ccb73e4"}
Apr 16 13:59:16.563499 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.563441 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xv4ws" podStartSLOduration=3.146973794 podStartE2EDuration="21.563425269s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="2026-04-16 13:58:56.827315849 +0000 UTC m=+1.983650875" lastFinishedPulling="2026-04-16 13:59:15.243767325 +0000 UTC m=+20.400102350" observedRunningTime="2026-04-16 13:59:15.580029715 +0000 UTC m=+20.736364757" watchObservedRunningTime="2026-04-16 13:59:16.563425269 +0000 UTC m=+21.719760312"
Apr 16 13:59:16.563976 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.563943 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5p4kh" podStartSLOduration=7.93851494 podStartE2EDuration="21.563934302s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="2026-04-16 13:58:57.309799182 +0000 UTC m=+2.466134200" lastFinishedPulling="2026-04-16 13:59:10.93521854 +0000 UTC m=+16.091553562" observedRunningTime="2026-04-16 13:59:16.563378663 +0000 UTC m=+21.719713707" watchObservedRunningTime="2026-04-16 13:59:16.563934302 +0000 UTC m=+21.720269340"
Apr 16 13:59:16.581510 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.581466 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-g9npf" podStartSLOduration=13.585590035 podStartE2EDuration="21.581435413s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="2026-04-16 13:58:57.202553271 +0000 UTC m=+2.358888290" lastFinishedPulling="2026-04-16 13:59:05.19839864 +0000 UTC m=+10.354733668" observedRunningTime="2026-04-16 13:59:16.580753397 +0000 UTC m=+21.737088439" watchObservedRunningTime="2026-04-16 13:59:16.581435413 +0000 UTC m=+21.737770454"
Apr 16 13:59:16.610243 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.610188 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lnsdn" podStartSLOduration=7.978422388 podStartE2EDuration="21.610168849s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="2026-04-16 13:58:57.303437701 +0000 UTC m=+2.459772720" lastFinishedPulling="2026-04-16 13:59:10.935184147 +0000 UTC m=+16.091519181" observedRunningTime="2026-04-16 13:59:16.609828389 +0000 UTC m=+21.766163430" watchObservedRunningTime="2026-04-16 13:59:16.610168849 +0000 UTC m=+21.766503891"
Apr 16 13:59:16.624801 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.624739 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kpzwx" podStartSLOduration=3.65185228 podStartE2EDuration="21.624720572s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="2026-04-16 13:58:57.266417309 +0000 UTC m=+2.422752328" lastFinishedPulling="2026-04-16 13:59:15.239285599 +0000 UTC m=+20.395620620" observedRunningTime="2026-04-16 13:59:16.624426886 +0000 UTC m=+21.780761928" watchObservedRunningTime="2026-04-16 13:59:16.624720572 +0000 UTC m=+21.781055614"
Apr 16 13:59:16.977172 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:16.977149 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 13:59:17.035603 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:17.035571 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-g9npf"
Apr 16 13:59:17.422241 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:17.422148 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:16.977168496Z","UUID":"1164dad2-570f-400d-b599-57ec9e9e64fc","Handler":null,"Name":"","Endpoint":""}
Apr 16 13:59:17.423662 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:17.423637 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 13:59:17.423662 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:17.423666 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 13:59:17.466344 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:17.466318 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 13:59:17.466572 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:17.466410 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2"
Apr 16 13:59:17.563832 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:17.563798 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" event={"ID":"8725739e-0bdd-4d97-b43d-4551f43cd997","Type":"ContainerStarted","Data":"69abadea30866370548f0e8a09d17539ed30b1011f9ef4f49d926587e58cc10e"}
Apr 16 13:59:17.565311 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:17.565265 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" event={"ID":"e83a86637d931638418a028ca5b217de","Type":"ContainerStarted","Data":"4f93fc0220385be07c5e747b18eb931e72ba579c8a98d9e7d5176c0b931d2172"}
Apr 16 13:59:17.578835 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:17.578794 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-59.ec2.internal" podStartSLOduration=22.578780109 podStartE2EDuration="22.578780109s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:17.578428395 +0000 UTC m=+22.734763433" watchObservedRunningTime="2026-04-16 13:59:17.578780109 +0000 UTC m=+22.735115149"
Apr 16 13:59:18.463879 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:18.463840 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:59:18.464052 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:18.463967 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea" Apr 16 13:59:18.569685 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:18.569370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" event={"ID":"8725739e-0bdd-4d97-b43d-4551f43cd997","Type":"ContainerStarted","Data":"a3f17aa65128e23a6b8e2b724a4b21739abe8b226190b4aca8cb7a22caf3f506"} Apr 16 13:59:18.572735 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:18.572695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" event={"ID":"40fcc100-8a15-40b9-a4d8-8c9913394f91","Type":"ContainerStarted","Data":"e2b3aea23d33717f4f5e759e0ad63f9914477e0090b7c851375b3161f3a884e9"} Apr 16 13:59:18.585251 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:18.585181 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cqlj" podStartSLOduration=2.689066183 podStartE2EDuration="23.585167199s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="2026-04-16 13:58:57.37749548 +0000 UTC m=+2.533830514" lastFinishedPulling="2026-04-16 13:59:18.273596511 +0000 UTC m=+23.429931530" observedRunningTime="2026-04-16 13:59:18.585091298 +0000 UTC m=+23.741426339" watchObservedRunningTime="2026-04-16 13:59:18.585167199 +0000 UTC m=+23.741502240" Apr 16 13:59:18.961737 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:18.961630 2572 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-g9npf" Apr 16 13:59:18.962547 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:18.962515 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-g9npf" Apr 16 13:59:19.467098 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:19.467068 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:59:19.467258 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:19.467197 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2" Apr 16 13:59:19.575469 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:19.575423 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-g9npf" Apr 16 13:59:20.463298 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:20.463211 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:59:20.463472 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:20.463352 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea" Apr 16 13:59:21.466422 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:21.466392 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:59:21.466797 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:21.466542 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2" Apr 16 13:59:22.464094 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:22.463858 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:59:22.464263 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:22.464145 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea" Apr 16 13:59:22.581660 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:22.581624 2572 generic.go:358] "Generic (PLEG): container finished" podID="eee60be0-add8-410b-982e-1aa1f11ec111" containerID="f0edb3793cfffda662891e60cf0959831b8ad5b236a39b09d991928df36da9e3" exitCode=0 Apr 16 13:59:22.582422 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:22.581708 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb72s" event={"ID":"eee60be0-add8-410b-982e-1aa1f11ec111","Type":"ContainerDied","Data":"f0edb3793cfffda662891e60cf0959831b8ad5b236a39b09d991928df36da9e3"} Apr 16 13:59:22.585321 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:22.585304 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" event={"ID":"40fcc100-8a15-40b9-a4d8-8c9913394f91","Type":"ContainerStarted","Data":"b843a5b64ce65229cb1bf981c7d9626a0c9ce02d326f42a46fbcf00d1cb427be"} Apr 16 13:59:22.585635 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:22.585622 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:59:22.585685 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:22.585642 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:59:22.599797 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:22.599779 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:59:22.623837 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:22.623795 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" podStartSLOduration=10.139651002 podStartE2EDuration="27.623781463s" podCreationTimestamp="2026-04-16 13:58:55 +0000 
UTC" firstStartedPulling="2026-04-16 13:58:57.859803853 +0000 UTC m=+3.016138874" lastFinishedPulling="2026-04-16 13:59:15.343934312 +0000 UTC m=+20.500269335" observedRunningTime="2026-04-16 13:59:22.622949714 +0000 UTC m=+27.779284776" watchObservedRunningTime="2026-04-16 13:59:22.623781463 +0000 UTC m=+27.780116702" Apr 16 13:59:23.463437 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:23.463401 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:59:23.463684 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:23.463533 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2" Apr 16 13:59:23.589128 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:23.589065 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:59:23.608535 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:23.607963 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:59:23.836374 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:23.836341 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bct9b"] Apr 16 13:59:23.836555 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:23.836513 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:59:23.836632 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:23.836607 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2" Apr 16 13:59:23.838988 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:23.838966 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6bp8d"] Apr 16 13:59:23.839102 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:23.839063 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:59:23.839158 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:23.839142 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea" Apr 16 13:59:24.591961 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:24.591928 2572 generic.go:358] "Generic (PLEG): container finished" podID="eee60be0-add8-410b-982e-1aa1f11ec111" containerID="a2cb9f804e33d21e2d7ce2010edd9b3e4ac0016c18595bf518d8c0a32e3eb67e" exitCode=0 Apr 16 13:59:24.592360 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:24.591965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb72s" event={"ID":"eee60be0-add8-410b-982e-1aa1f11ec111","Type":"ContainerDied","Data":"a2cb9f804e33d21e2d7ce2010edd9b3e4ac0016c18595bf518d8c0a32e3eb67e"} Apr 16 13:59:25.464621 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:25.464592 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:59:25.464789 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:25.464711 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea" Apr 16 13:59:25.464789 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:25.464767 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:59:25.464913 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:25.464886 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2" Apr 16 13:59:25.596175 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:25.596140 2572 generic.go:358] "Generic (PLEG): container finished" podID="eee60be0-add8-410b-982e-1aa1f11ec111" containerID="f6c3984c5f68fd0dc35de40467c2267eeb43f1c5d62ca8efe0d94c1d04c3716c" exitCode=0 Apr 16 13:59:25.596558 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:25.596231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb72s" event={"ID":"eee60be0-add8-410b-982e-1aa1f11ec111","Type":"ContainerDied","Data":"f6c3984c5f68fd0dc35de40467c2267eeb43f1c5d62ca8efe0d94c1d04c3716c"} Apr 16 13:59:27.463777 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:27.463741 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:59:27.464218 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:27.463783 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:59:27.464218 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:27.463879 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bct9b" podUID="fb8eecda-88c7-4d10-97ed-5f758d438dc2" Apr 16 13:59:27.464218 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:27.463998 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea" Apr 16 13:59:28.200103 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.200066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:59:28.200343 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:28.200220 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:28.200343 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:28.200293 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs podName:91ffb15b-8d84-4a65-a157-65c7adaca0ea nodeName:}" failed. No retries permitted until 2026-04-16 14:00:00.200277497 +0000 UTC m=+65.356612516 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs") pod "network-metrics-daemon-6bp8d" (UID: "91ffb15b-8d84-4a65-a157-65c7adaca0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:28.208361 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.208334 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-59.ec2.internal" event="NodeReady" Apr 16 13:59:28.208509 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.208500 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 13:59:28.252511 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.252479 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nbj9l"] Apr 16 13:59:28.254626 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.254367 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.254806 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.254784 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4mk8h"] Apr 16 13:59:28.256341 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.256320 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:28.256713 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.256692 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 13:59:28.256849 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.256828 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2c9fp\"" Apr 16 13:59:28.256924 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.256846 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 13:59:28.258467 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.258439 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wz4rp\"" Apr 16 13:59:28.258568 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.258481 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 13:59:28.258568 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.258492 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 13:59:28.258568 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.258487 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 13:59:28.264038 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.264020 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nbj9l"] Apr 16 13:59:28.266798 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.266769 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4mk8h"] Apr 16 13:59:28.300468 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.300427 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7shg7\" (UniqueName: \"kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7\") pod \"network-check-target-bct9b\" (UID: \"fb8eecda-88c7-4d10-97ed-5f758d438dc2\") " pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:59:28.300629 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:28.300608 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:28.300678 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:28.300634 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:28.300678 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:28.300648 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7shg7 for pod openshift-network-diagnostics/network-check-target-bct9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:28.300766 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:28.300715 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7 podName:fb8eecda-88c7-4d10-97ed-5f758d438dc2 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:00.300695797 +0000 UTC m=+65.457030831 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7shg7" (UniqueName: "kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7") pod "network-check-target-bct9b" (UID: "fb8eecda-88c7-4d10-97ed-5f758d438dc2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:28.400766 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.400731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89be9464-73dc-4031-a7c8-03fa1b9164f2-config-volume\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.400766 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.400767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/89be9464-73dc-4031-a7c8-03fa1b9164f2-tmp-dir\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.401006 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.400798 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-826dv\" (UniqueName: \"kubernetes.io/projected/89be9464-73dc-4031-a7c8-03fa1b9164f2-kube-api-access-826dv\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.401006 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.400880 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " 
pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:28.401006 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.400919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82ssb\" (UniqueName: \"kubernetes.io/projected/0707b413-706b-4c25-9e10-ea274017e762-kube-api-access-82ssb\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:28.401006 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.400936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.501598 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.501503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:28.501598 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.501581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82ssb\" (UniqueName: \"kubernetes.io/projected/0707b413-706b-4c25-9e10-ea274017e762-kube-api-access-82ssb\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:28.502078 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.501610 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: 
\"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.502078 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.501656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89be9464-73dc-4031-a7c8-03fa1b9164f2-config-volume\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.502078 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:28.501675 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:28.502078 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.501712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/89be9464-73dc-4031-a7c8-03fa1b9164f2-tmp-dir\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.502078 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.501733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-826dv\" (UniqueName: \"kubernetes.io/projected/89be9464-73dc-4031-a7c8-03fa1b9164f2-kube-api-access-826dv\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.502078 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:28.501755 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert podName:0707b413-706b-4c25-9e10-ea274017e762 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.00173054 +0000 UTC m=+34.158065564 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert") pod "ingress-canary-4mk8h" (UID: "0707b413-706b-4c25-9e10-ea274017e762") : secret "canary-serving-cert" not found Apr 16 13:59:28.502078 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:28.501956 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:28.502078 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:28.501999 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls podName:89be9464-73dc-4031-a7c8-03fa1b9164f2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.001985724 +0000 UTC m=+34.158320743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls") pod "dns-default-nbj9l" (UID: "89be9464-73dc-4031-a7c8-03fa1b9164f2") : secret "dns-default-metrics-tls" not found Apr 16 13:59:28.502348 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.502193 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/89be9464-73dc-4031-a7c8-03fa1b9164f2-tmp-dir\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.502348 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.502290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89be9464-73dc-4031-a7c8-03fa1b9164f2-config-volume\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.512762 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.512627 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-826dv\" (UniqueName: \"kubernetes.io/projected/89be9464-73dc-4031-a7c8-03fa1b9164f2-kube-api-access-826dv\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:28.512939 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:28.512912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82ssb\" (UniqueName: \"kubernetes.io/projected/0707b413-706b-4c25-9e10-ea274017e762-kube-api-access-82ssb\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:29.005831 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:29.005789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:29.006029 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:29.005873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:29.006029 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:29.005913 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:29.006029 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:29.005991 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:29.006029 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:29.005995 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert podName:0707b413-706b-4c25-9e10-ea274017e762 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:30.005974376 +0000 UTC m=+35.162309417 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert") pod "ingress-canary-4mk8h" (UID: "0707b413-706b-4c25-9e10-ea274017e762") : secret "canary-serving-cert" not found Apr 16 13:59:29.006243 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:29.006054 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls podName:89be9464-73dc-4031-a7c8-03fa1b9164f2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:30.006037061 +0000 UTC m=+35.162372088 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls") pod "dns-default-nbj9l" (UID: "89be9464-73dc-4031-a7c8-03fa1b9164f2") : secret "dns-default-metrics-tls" not found Apr 16 13:59:29.463817 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:29.463778 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b" Apr 16 13:59:29.464052 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:29.464017 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 13:59:29.466610 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:29.466588 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:59:29.466834 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:29.466776 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:59:29.466915 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:29.466889 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 13:59:29.467698 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:29.467678 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j4r87\"" Apr 16 13:59:29.467778 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:29.467680 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b98gv\"" Apr 16 13:59:30.013570 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:30.013526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:30.014026 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:30.013595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:30.014026 ip-10-0-140-59 kubenswrapper[2572]: E0416 
13:59:30.013686 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:30.014026 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:30.013689 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:30.014026 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:30.013739 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls podName:89be9464-73dc-4031-a7c8-03fa1b9164f2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:32.013723747 +0000 UTC m=+37.170058765 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls") pod "dns-default-nbj9l" (UID: "89be9464-73dc-4031-a7c8-03fa1b9164f2") : secret "dns-default-metrics-tls" not found Apr 16 13:59:30.014026 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:30.013752 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert podName:0707b413-706b-4c25-9e10-ea274017e762 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:32.013745742 +0000 UTC m=+37.170080760 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert") pod "ingress-canary-4mk8h" (UID: "0707b413-706b-4c25-9e10-ea274017e762") : secret "canary-serving-cert" not found Apr 16 13:59:32.029885 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:32.029858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:32.030320 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:32.029939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:32.030320 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:32.030027 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:32.030320 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:32.030089 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:32.030320 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:32.030113 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls podName:89be9464-73dc-4031-a7c8-03fa1b9164f2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:36.030094001 +0000 UTC m=+41.186429033 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls") pod "dns-default-nbj9l" (UID: "89be9464-73dc-4031-a7c8-03fa1b9164f2") : secret "dns-default-metrics-tls" not found Apr 16 13:59:32.030320 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:32.030147 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert podName:0707b413-706b-4c25-9e10-ea274017e762 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:36.030129772 +0000 UTC m=+41.186464814 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert") pod "ingress-canary-4mk8h" (UID: "0707b413-706b-4c25-9e10-ea274017e762") : secret "canary-serving-cert" not found Apr 16 13:59:33.614082 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:33.614043 2572 generic.go:358] "Generic (PLEG): container finished" podID="eee60be0-add8-410b-982e-1aa1f11ec111" containerID="9e18d7573a89d4ba15a645fa988cc1eef9a2185b8a5cbf76c2eada151a42dbe9" exitCode=0 Apr 16 13:59:33.614701 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:33.614094 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb72s" event={"ID":"eee60be0-add8-410b-982e-1aa1f11ec111","Type":"ContainerDied","Data":"9e18d7573a89d4ba15a645fa988cc1eef9a2185b8a5cbf76c2eada151a42dbe9"} Apr 16 13:59:34.618758 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:34.618727 2572 generic.go:358] "Generic (PLEG): container finished" podID="eee60be0-add8-410b-982e-1aa1f11ec111" containerID="0bbd919e976c201c37f5c8e50713ceb0e2865f6014e5844d38654f75943bd27a" exitCode=0 Apr 16 13:59:34.619234 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:34.618766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb72s" 
event={"ID":"eee60be0-add8-410b-982e-1aa1f11ec111","Type":"ContainerDied","Data":"0bbd919e976c201c37f5c8e50713ceb0e2865f6014e5844d38654f75943bd27a"} Apr 16 13:59:35.623029 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:35.622993 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb72s" event={"ID":"eee60be0-add8-410b-982e-1aa1f11ec111","Type":"ContainerStarted","Data":"febd0c39f2c44aa06f7605c856e7e48959dac5511bbe72cba538890054b2ffa2"} Apr 16 13:59:35.643867 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:35.643822 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bb72s" podStartSLOduration=4.823519525 podStartE2EDuration="40.643807074s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="2026-04-16 13:58:56.848443842 +0000 UTC m=+2.004778862" lastFinishedPulling="2026-04-16 13:59:32.668731393 +0000 UTC m=+37.825066411" observedRunningTime="2026-04-16 13:59:35.642562882 +0000 UTC m=+40.798897923" watchObservedRunningTime="2026-04-16 13:59:35.643807074 +0000 UTC m=+40.800142114" Apr 16 13:59:36.058980 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:36.058947 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:36.059136 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:36.059023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:36.059136 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:36.059087 2572 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:36.059136 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:36.059111 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:36.059235 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:36.059150 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls podName:89be9464-73dc-4031-a7c8-03fa1b9164f2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:44.059134631 +0000 UTC m=+49.215469650 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls") pod "dns-default-nbj9l" (UID: "89be9464-73dc-4031-a7c8-03fa1b9164f2") : secret "dns-default-metrics-tls" not found Apr 16 13:59:36.059235 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:36.059163 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert podName:0707b413-706b-4c25-9e10-ea274017e762 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:44.059157566 +0000 UTC m=+49.215492585 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert") pod "ingress-canary-4mk8h" (UID: "0707b413-706b-4c25-9e10-ea274017e762") : secret "canary-serving-cert" not found Apr 16 13:59:44.114273 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:44.114226 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 13:59:44.114826 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:44.114295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l" Apr 16 13:59:44.114826 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:44.114393 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:44.114826 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:44.114468 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls podName:89be9464-73dc-4031-a7c8-03fa1b9164f2 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:00.114438816 +0000 UTC m=+65.270773834 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls") pod "dns-default-nbj9l" (UID: "89be9464-73dc-4031-a7c8-03fa1b9164f2") : secret "dns-default-metrics-tls" not found Apr 16 13:59:44.114826 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:44.114393 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:44.114826 ip-10-0-140-59 kubenswrapper[2572]: E0416 13:59:44.114529 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert podName:0707b413-706b-4c25-9e10-ea274017e762 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:00.114518631 +0000 UTC m=+65.270853667 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert") pod "ingress-canary-4mk8h" (UID: "0707b413-706b-4c25-9e10-ea274017e762") : secret "canary-serving-cert" not found Apr 16 13:59:55.606345 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:55.606316 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppbcs" Apr 16 13:59:58.520575 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.520539 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s"] Apr 16 13:59:58.525074 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.525056 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" Apr 16 13:59:58.527188 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.527168 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 13:59:58.527284 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.527176 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 13:59:58.527427 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.527413 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 13:59:58.528090 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.528077 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 13:59:58.532591 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.532567 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s"] Apr 16 13:59:58.553361 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.553329 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw"] Apr 16 13:59:58.556300 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.556286 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.558609 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.558589 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 13:59:58.558739 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.558621 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 13:59:58.558739 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.558684 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 13:59:58.558739 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.558722 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 13:59:58.563732 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.563711 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw"] Apr 16 13:59:58.611902 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.611867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cfdbcecf-4854-4fd5-8583-7d3fbf14371c-tmp\") pod \"klusterlet-addon-workmgr-84d7955d9d-2lh4s\" (UID: \"cfdbcecf-4854-4fd5-8583-7d3fbf14371c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" Apr 16 13:59:58.611902 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.611897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/cfdbcecf-4854-4fd5-8583-7d3fbf14371c-klusterlet-config\") pod \"klusterlet-addon-workmgr-84d7955d9d-2lh4s\" (UID: \"cfdbcecf-4854-4fd5-8583-7d3fbf14371c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" Apr 16 13:59:58.612104 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.611929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blsgp\" (UniqueName: \"kubernetes.io/projected/cfdbcecf-4854-4fd5-8583-7d3fbf14371c-kube-api-access-blsgp\") pod \"klusterlet-addon-workmgr-84d7955d9d-2lh4s\" (UID: \"cfdbcecf-4854-4fd5-8583-7d3fbf14371c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" Apr 16 13:59:58.612104 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.611954 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gb7\" (UniqueName: \"kubernetes.io/projected/dcf1dd91-6854-4854-9c39-44ac8bf04253-kube-api-access-f7gb7\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.612104 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.611971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/dcf1dd91-6854-4854-9c39-44ac8bf04253-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.612104 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.611990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-hub\") 
pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.612104 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.612003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.612104 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.612017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.612104 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.612080 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-ca\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.712823 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.712782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7gb7\" (UniqueName: \"kubernetes.io/projected/dcf1dd91-6854-4854-9c39-44ac8bf04253-kube-api-access-f7gb7\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: 
\"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.712999 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.712837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/dcf1dd91-6854-4854-9c39-44ac8bf04253-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.712999 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.712872 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-hub\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.712999 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.712895 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.712999 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.712918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.712999 ip-10-0-140-59 
kubenswrapper[2572]: I0416 13:59:58.712972 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-ca\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.713239 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.713179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cfdbcecf-4854-4fd5-8583-7d3fbf14371c-tmp\") pod \"klusterlet-addon-workmgr-84d7955d9d-2lh4s\" (UID: \"cfdbcecf-4854-4fd5-8583-7d3fbf14371c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" Apr 16 13:59:58.713289 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.713240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cfdbcecf-4854-4fd5-8583-7d3fbf14371c-klusterlet-config\") pod \"klusterlet-addon-workmgr-84d7955d9d-2lh4s\" (UID: \"cfdbcecf-4854-4fd5-8583-7d3fbf14371c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" Apr 16 13:59:58.713344 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.713300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blsgp\" (UniqueName: \"kubernetes.io/projected/cfdbcecf-4854-4fd5-8583-7d3fbf14371c-kube-api-access-blsgp\") pod \"klusterlet-addon-workmgr-84d7955d9d-2lh4s\" (UID: \"cfdbcecf-4854-4fd5-8583-7d3fbf14371c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" Apr 16 13:59:58.713687 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.713662 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/dcf1dd91-6854-4854-9c39-44ac8bf04253-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.713874 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.713801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cfdbcecf-4854-4fd5-8583-7d3fbf14371c-tmp\") pod \"klusterlet-addon-workmgr-84d7955d9d-2lh4s\" (UID: \"cfdbcecf-4854-4fd5-8583-7d3fbf14371c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" Apr 16 13:59:58.715914 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.715884 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.715995 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.715918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-ca\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 13:59:58.716069 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.716048 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-hub\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw"
Apr 16 13:59:58.716159 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.716144 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dcf1dd91-6854-4854-9c39-44ac8bf04253-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw"
Apr 16 13:59:58.716605 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.716587 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cfdbcecf-4854-4fd5-8583-7d3fbf14371c-klusterlet-config\") pod \"klusterlet-addon-workmgr-84d7955d9d-2lh4s\" (UID: \"cfdbcecf-4854-4fd5-8583-7d3fbf14371c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s"
Apr 16 13:59:58.720272 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.720247 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blsgp\" (UniqueName: \"kubernetes.io/projected/cfdbcecf-4854-4fd5-8583-7d3fbf14371c-kube-api-access-blsgp\") pod \"klusterlet-addon-workmgr-84d7955d9d-2lh4s\" (UID: \"cfdbcecf-4854-4fd5-8583-7d3fbf14371c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s"
Apr 16 13:59:58.720375 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.720358 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gb7\" (UniqueName: \"kubernetes.io/projected/dcf1dd91-6854-4854-9c39-44ac8bf04253-kube-api-access-f7gb7\") pod \"cluster-proxy-proxy-agent-65f8869c57-w9xpw\" (UID: \"dcf1dd91-6854-4854-9c39-44ac8bf04253\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw"
Apr 16 13:59:58.834774 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.834731 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s"
Apr 16 13:59:58.880971 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.880937 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw"
Apr 16 13:59:58.968319 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:58.968255 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s"]
Apr 16 13:59:58.972869 ip-10-0-140-59 kubenswrapper[2572]: W0416 13:59:58.972834 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfdbcecf_4854_4fd5_8583_7d3fbf14371c.slice/crio-af7f60876fb976711e542e9900f6917db77da2dac74c89068b66099458b5393f WatchSource:0}: Error finding container af7f60876fb976711e542e9900f6917db77da2dac74c89068b66099458b5393f: Status 404 returned error can't find the container with id af7f60876fb976711e542e9900f6917db77da2dac74c89068b66099458b5393f
Apr 16 13:59:59.012685 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:59.012653 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw"]
Apr 16 13:59:59.670288 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:59.670247 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" event={"ID":"dcf1dd91-6854-4854-9c39-44ac8bf04253","Type":"ContainerStarted","Data":"0925a8f129b2f58d9bcbc7809972d54f8fccc8b3d685756c4fec8a04d6366fe9"}
Apr 16 13:59:59.671290 ip-10-0-140-59 kubenswrapper[2572]: I0416 13:59:59.671262 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" event={"ID":"cfdbcecf-4854-4fd5-8583-7d3fbf14371c","Type":"ContainerStarted","Data":"af7f60876fb976711e542e9900f6917db77da2dac74c89068b66099458b5393f"}
Apr 16 14:00:00.122408 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.122366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h"
Apr 16 14:00:00.122620 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.122475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l"
Apr 16 14:00:00.122620 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:00:00.122515 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:00.122620 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:00:00.122591 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert podName:0707b413-706b-4c25-9e10-ea274017e762 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:32.122571144 +0000 UTC m=+97.278906164 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert") pod "ingress-canary-4mk8h" (UID: "0707b413-706b-4c25-9e10-ea274017e762") : secret "canary-serving-cert" not found
Apr 16 14:00:00.122797 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:00:00.122641 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:00.122797 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:00:00.122694 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls podName:89be9464-73dc-4031-a7c8-03fa1b9164f2 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:32.12267725 +0000 UTC m=+97.279012272 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls") pod "dns-default-nbj9l" (UID: "89be9464-73dc-4031-a7c8-03fa1b9164f2") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:00.223959 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.223433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 14:00:00.226124 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.225896 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:00:00.234164 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:00:00.233839 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:00:00.234164 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:00:00.233931 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs podName:91ffb15b-8d84-4a65-a157-65c7adaca0ea nodeName:}" failed. No retries permitted until 2026-04-16 14:01:04.233891749 +0000 UTC m=+129.390226773 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs") pod "network-metrics-daemon-6bp8d" (UID: "91ffb15b-8d84-4a65-a157-65c7adaca0ea") : secret "metrics-daemon-secret" not found
Apr 16 14:00:00.324573 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.324080 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7shg7\" (UniqueName: \"kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7\") pod \"network-check-target-bct9b\" (UID: \"fb8eecda-88c7-4d10-97ed-5f758d438dc2\") " pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 14:00:00.326867 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.326628 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:00:00.336542 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.336396 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:00:00.353648 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.353606 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7shg7\" (UniqueName: \"kubernetes.io/projected/fb8eecda-88c7-4d10-97ed-5f758d438dc2-kube-api-access-7shg7\") pod \"network-check-target-bct9b\" (UID: \"fb8eecda-88c7-4d10-97ed-5f758d438dc2\") " pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 14:00:00.379370 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.379240 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j4r87\""
Apr 16 14:00:00.387474 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.387037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 14:00:00.538420 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.538202 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bct9b"]
Apr 16 14:00:00.676161 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:00.676027 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bct9b" event={"ID":"fb8eecda-88c7-4d10-97ed-5f758d438dc2","Type":"ContainerStarted","Data":"37b3c14aef329106deb394710b7935a3395a5270e9ea87e5889cae702562610e"}
Apr 16 14:00:03.685203 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:03.684727 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" event={"ID":"dcf1dd91-6854-4854-9c39-44ac8bf04253","Type":"ContainerStarted","Data":"8110583ae999877afea1c8e6c875f82b72a68ffd674d1251e20a8dbb714f494c"}
Apr 16 14:00:03.686540 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:03.686511 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" event={"ID":"cfdbcecf-4854-4fd5-8583-7d3fbf14371c","Type":"ContainerStarted","Data":"60dafc06f9ca1077ef04b7caf85d89ac635a0ba571e9f382bdbcc24f9f25b01b"}
Apr 16 14:00:03.686804 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:03.686754 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s"
Apr 16 14:00:03.688019 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:03.688002 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s"
Apr 16 14:00:03.703535 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:03.703479 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d7955d9d-2lh4s" podStartSLOduration=1.104730437 podStartE2EDuration="5.703443984s" podCreationTimestamp="2026-04-16 13:59:58 +0000 UTC" firstStartedPulling="2026-04-16 13:59:58.974711416 +0000 UTC m=+64.131046448" lastFinishedPulling="2026-04-16 14:00:03.573424961 +0000 UTC m=+68.729759995" observedRunningTime="2026-04-16 14:00:03.703189106 +0000 UTC m=+68.859524144" watchObservedRunningTime="2026-04-16 14:00:03.703443984 +0000 UTC m=+68.859779025"
Apr 16 14:00:05.691308 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:05.691268 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bct9b" event={"ID":"fb8eecda-88c7-4d10-97ed-5f758d438dc2","Type":"ContainerStarted","Data":"9007ba54725109517c9790b89127241cc6c9ad41e4cbb89a229bb593e5969f8d"}
Apr 16 14:00:05.691838 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:05.691430 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 14:00:05.705662 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:05.705612 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bct9b" podStartSLOduration=66.576917934 podStartE2EDuration="1m10.705596862s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="2026-04-16 14:00:00.544791104 +0000 UTC m=+65.701126127" lastFinishedPulling="2026-04-16 14:00:04.673470035 +0000 UTC m=+69.829805055" observedRunningTime="2026-04-16 14:00:05.70518424 +0000 UTC m=+70.861519280" watchObservedRunningTime="2026-04-16 14:00:05.705596862 +0000 UTC m=+70.861931905"
Apr 16 14:00:07.696980 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:07.696887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" event={"ID":"dcf1dd91-6854-4854-9c39-44ac8bf04253","Type":"ContainerStarted","Data":"abaf2403fe979a435358ecffa3f21bae0840e9d55844f9ae968930b810d46d9c"}
Apr 16 14:00:07.696980 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:07.696922 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" event={"ID":"dcf1dd91-6854-4854-9c39-44ac8bf04253","Type":"ContainerStarted","Data":"27064eb9b8f8d4e2598d5def7ae41812c53bd7bfddb9b618448bd3f81c7f6cae"}
Apr 16 14:00:32.146685 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:32.146642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h"
Apr 16 14:00:32.147087 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:32.146701 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l"
Apr 16 14:00:32.147087 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:00:32.146798 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:32.147087 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:00:32.146803 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:32.147087 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:00:32.146861 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls podName:89be9464-73dc-4031-a7c8-03fa1b9164f2 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:36.146846834 +0000 UTC m=+161.303181853 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls") pod "dns-default-nbj9l" (UID: "89be9464-73dc-4031-a7c8-03fa1b9164f2") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:32.147087 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:00:32.146883 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert podName:0707b413-706b-4c25-9e10-ea274017e762 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:36.146868296 +0000 UTC m=+161.303203320 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert") pod "ingress-canary-4mk8h" (UID: "0707b413-706b-4c25-9e10-ea274017e762") : secret "canary-serving-cert" not found
Apr 16 14:00:36.695810 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:36.695778 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bct9b"
Apr 16 14:00:36.713679 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:36.713629 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" podStartSLOduration=30.3269365 podStartE2EDuration="38.713612512s" podCreationTimestamp="2026-04-16 13:59:58 +0000 UTC" firstStartedPulling="2026-04-16 13:59:59.018997081 +0000 UTC m=+64.175332105" lastFinishedPulling="2026-04-16 14:00:07.405673099 +0000 UTC m=+72.562008117" observedRunningTime="2026-04-16 14:00:07.714084292 +0000 UTC m=+72.870419332" watchObservedRunningTime="2026-04-16 14:00:36.713612512 +0000 UTC m=+101.869947553"
Apr 16 14:00:56.857086 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:56.857057 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lnsdn_9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d/dns-node-resolver/0.log"
Apr 16 14:00:57.456991 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:00:57.456960 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5p4kh_e11c6740-8d55-4673-8d82-f90f6a93b413/node-ca/0.log"
Apr 16 14:01:04.273969 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:04.273915 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 14:01:04.274509 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:04.274057 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:01:04.274509 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:04.274124 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs podName:91ffb15b-8d84-4a65-a157-65c7adaca0ea nodeName:}" failed. No retries permitted until 2026-04-16 14:03:06.274108539 +0000 UTC m=+251.430443558 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs") pod "network-metrics-daemon-6bp8d" (UID: "91ffb15b-8d84-4a65-a157-65c7adaca0ea") : secret "metrics-daemon-secret" not found
Apr 16 14:01:08.114296 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.114261 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pfcp5"]
Apr 16 14:01:08.117472 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.117438 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.119918 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.119897 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lstgd\""
Apr 16 14:01:08.120675 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.120657 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 14:01:08.120797 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.120686 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 14:01:08.120797 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.120706 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 14:01:08.120797 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.120746 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 14:01:08.128469 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.128437 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pfcp5"]
Apr 16 14:01:08.202181 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.202128 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9wgd\" (UniqueName: \"kubernetes.io/projected/e6fafb01-43fc-4828-8367-8e3b641523ae-kube-api-access-l9wgd\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.202389 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.202202 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.202389 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.202240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e6fafb01-43fc-4828-8367-8e3b641523ae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.202389 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.202262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e6fafb01-43fc-4828-8367-8e3b641523ae-crio-socket\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.202389 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.202284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e6fafb01-43fc-4828-8367-8e3b641523ae-data-volume\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.303298 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.303247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9wgd\" (UniqueName: \"kubernetes.io/projected/e6fafb01-43fc-4828-8367-8e3b641523ae-kube-api-access-l9wgd\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.303298 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.303306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.303564 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.303336 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e6fafb01-43fc-4828-8367-8e3b641523ae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.303564 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.303359 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e6fafb01-43fc-4828-8367-8e3b641523ae-crio-socket\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.303564 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.303378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e6fafb01-43fc-4828-8367-8e3b641523ae-data-volume\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.303564 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.303499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e6fafb01-43fc-4828-8367-8e3b641523ae-crio-socket\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.303564 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:08.303534 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 14:01:08.303742 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:08.303627 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls podName:e6fafb01-43fc-4828-8367-8e3b641523ae nodeName:}" failed. No retries permitted until 2026-04-16 14:01:08.803604308 +0000 UTC m=+133.959939329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls") pod "insights-runtime-extractor-pfcp5" (UID: "e6fafb01-43fc-4828-8367-8e3b641523ae") : secret "insights-runtime-extractor-tls" not found
Apr 16 14:01:08.303742 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.303689 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e6fafb01-43fc-4828-8367-8e3b641523ae-data-volume\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.303903 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.303883 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e6fafb01-43fc-4828-8367-8e3b641523ae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.312536 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.312502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9wgd\" (UniqueName: \"kubernetes.io/projected/e6fafb01-43fc-4828-8367-8e3b641523ae-kube-api-access-l9wgd\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.806628 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:08.806588 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:08.806808 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:08.806740 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 14:01:08.806850 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:08.806816 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls podName:e6fafb01-43fc-4828-8367-8e3b641523ae nodeName:}" failed. No retries permitted until 2026-04-16 14:01:09.806799585 +0000 UTC m=+134.963134603 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls") pod "insights-runtime-extractor-pfcp5" (UID: "e6fafb01-43fc-4828-8367-8e3b641523ae") : secret "insights-runtime-extractor-tls" not found
Apr 16 14:01:09.813907 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:09.813874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:09.814395 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:09.814034 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 14:01:09.814395 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:09.814100 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls podName:e6fafb01-43fc-4828-8367-8e3b641523ae nodeName:}" failed. No retries permitted until 2026-04-16 14:01:11.814074836 +0000 UTC m=+136.970409854 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls") pod "insights-runtime-extractor-pfcp5" (UID: "e6fafb01-43fc-4828-8367-8e3b641523ae") : secret "insights-runtime-extractor-tls" not found
Apr 16 14:01:11.829149 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:11.829117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:11.829548 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:11.829273 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 14:01:11.829548 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:11.829340 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls podName:e6fafb01-43fc-4828-8367-8e3b641523ae nodeName:}" failed. No retries permitted until 2026-04-16 14:01:15.829318136 +0000 UTC m=+140.985653155 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls") pod "insights-runtime-extractor-pfcp5" (UID: "e6fafb01-43fc-4828-8367-8e3b641523ae") : secret "insights-runtime-extractor-tls" not found
Apr 16 14:01:15.858701 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:15.858660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:15.860908 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:15.860886 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6fafb01-43fc-4828-8367-8e3b641523ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pfcp5\" (UID: \"e6fafb01-43fc-4828-8367-8e3b641523ae\") " pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:15.926101 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:15.926053 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pfcp5"
Apr 16 14:01:16.044858 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:16.044827 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pfcp5"]
Apr 16 14:01:16.048240 ip-10-0-140-59 kubenswrapper[2572]: W0416 14:01:16.048212 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6fafb01_43fc_4828_8367_8e3b641523ae.slice/crio-5041815bfd02f8d768ed0e97e665237aeff4b54c0371a96a161692eed4440013 WatchSource:0}: Error finding container 5041815bfd02f8d768ed0e97e665237aeff4b54c0371a96a161692eed4440013: Status 404 returned error can't find the container with id 5041815bfd02f8d768ed0e97e665237aeff4b54c0371a96a161692eed4440013
Apr 16 14:01:16.833016 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:16.832983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pfcp5" event={"ID":"e6fafb01-43fc-4828-8367-8e3b641523ae","Type":"ContainerStarted","Data":"45f2db4d13be78daf50af2b012d6a3d140b0a903de343c0afec250d0f3173d7e"}
Apr 16 14:01:16.833016 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:16.833019 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pfcp5" event={"ID":"e6fafb01-43fc-4828-8367-8e3b641523ae","Type":"ContainerStarted","Data":"5041815bfd02f8d768ed0e97e665237aeff4b54c0371a96a161692eed4440013"}
Apr 16 14:01:17.837424 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:17.837382 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pfcp5" event={"ID":"e6fafb01-43fc-4828-8367-8e3b641523ae","Type":"ContainerStarted","Data":"f0f4269133b4301524574f570d4b1a8143171c004658608d132ee52ebbe3fb17"}
Apr 16 14:01:18.841969 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:18.841939 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pfcp5" event={"ID":"e6fafb01-43fc-4828-8367-8e3b641523ae","Type":"ContainerStarted","Data":"2537ac64f2782d595a14e22034d66bd585bdee8c2caf53c6795212b26defebde"}
Apr 16 14:01:18.857880 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:18.857784 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pfcp5" podStartSLOduration=8.599698625 podStartE2EDuration="10.857769707s" podCreationTimestamp="2026-04-16 14:01:08 +0000 UTC" firstStartedPulling="2026-04-16 14:01:16.103074538 +0000 UTC m=+141.259409557" lastFinishedPulling="2026-04-16 14:01:18.36114562 +0000 UTC m=+143.517480639" observedRunningTime="2026-04-16 14:01:18.857357983 +0000 UTC m=+144.013693021" watchObservedRunningTime="2026-04-16 14:01:18.857769707 +0000 UTC m=+144.014104749"
Apr 16 14:01:26.747675 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.747517 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5c4c8c4dc6-5btcw"]
Apr 16 14:01:26.750631 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.750615 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw"
Apr 16 14:01:26.755723 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.755702 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:01:26.756195 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.756173 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gdggq\""
Apr 16 14:01:26.756621 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.756413 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:01:26.756621 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.756414 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:01:26.766973 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.766953 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:01:26.776547 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.776395 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c4c8c4dc6-5btcw"]
Apr 16 14:01:26.842610 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.842575 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-installation-pull-secrets\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw"
Apr 16 14:01:26.842610 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.842623 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-registry-certificates\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw"
Apr 16 14:01:26.842843 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.842642 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vst25\" (UniqueName: \"kubernetes.io/projected/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-kube-api-access-vst25\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw"
Apr 16 14:01:26.842843 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.842666 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-registry-tls\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw"
Apr 16 14:01:26.842843 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.842686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-trusted-ca\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw"
Apr 16 14:01:26.842843 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.842702 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-bound-sa-token\") pod
\"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.842843 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.842726 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-ca-trust-extracted\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.842843 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.842810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-image-registry-private-configuration\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.943642 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.943605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-installation-pull-secrets\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.943831 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.943656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-registry-certificates\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 
14:01:26.943831 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.943685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vst25\" (UniqueName: \"kubernetes.io/projected/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-kube-api-access-vst25\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.943831 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.943711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-registry-tls\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.943831 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.943735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-trusted-ca\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.943831 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.943757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-bound-sa-token\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.943831 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.943789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-ca-trust-extracted\") pod 
\"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.944132 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.943842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-image-registry-private-configuration\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.944353 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.944316 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-ca-trust-extracted\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.944723 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.944703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-registry-certificates\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.944835 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.944813 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-trusted-ca\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.946343 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.946314 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-image-registry-private-configuration\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.946418 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.946345 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-registry-tls\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.946476 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.946411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-installation-pull-secrets\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.951297 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.951276 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-bound-sa-token\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: \"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:26.951697 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:26.951669 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vst25\" (UniqueName: \"kubernetes.io/projected/ebcf4857-d3a9-4b7e-aa16-45f0fa462b95-kube-api-access-vst25\") pod \"image-registry-5c4c8c4dc6-5btcw\" (UID: 
\"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95\") " pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:27.059310 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:27.059271 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:27.188531 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:27.188501 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c4c8c4dc6-5btcw"] Apr 16 14:01:27.192130 ip-10-0-140-59 kubenswrapper[2572]: W0416 14:01:27.192101 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebcf4857_d3a9_4b7e_aa16_45f0fa462b95.slice/crio-84da238a10535e9f78226d3053193ca6fedfea815a4bb310f951c74fba82c08d WatchSource:0}: Error finding container 84da238a10535e9f78226d3053193ca6fedfea815a4bb310f951c74fba82c08d: Status 404 returned error can't find the container with id 84da238a10535e9f78226d3053193ca6fedfea815a4bb310f951c74fba82c08d Apr 16 14:01:27.862983 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:27.862943 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" event={"ID":"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95","Type":"ContainerStarted","Data":"a82e71a74ccf7a7c9d6d4671197d6dbaf22503d93e292fcd6062f9ca0f6101f2"} Apr 16 14:01:27.862983 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:27.862986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" event={"ID":"ebcf4857-d3a9-4b7e-aa16-45f0fa462b95","Type":"ContainerStarted","Data":"84da238a10535e9f78226d3053193ca6fedfea815a4bb310f951c74fba82c08d"} Apr 16 14:01:27.863482 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:27.863087 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:27.881279 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:27.881231 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" podStartSLOduration=1.8812185609999998 podStartE2EDuration="1.881218561s" podCreationTimestamp="2026-04-16 14:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:27.879994579 +0000 UTC m=+153.036329620" watchObservedRunningTime="2026-04-16 14:01:27.881218561 +0000 UTC m=+153.037553602" Apr 16 14:01:31.267942 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:31.267903 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-nbj9l" podUID="89be9464-73dc-4031-a7c8-03fa1b9164f2" Apr 16 14:01:31.275124 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:31.275091 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4mk8h" podUID="0707b413-706b-4c25-9e10-ea274017e762" Apr 16 14:01:31.874204 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:31.874169 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4mk8h" Apr 16 14:01:31.874390 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:31.874169 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nbj9l" Apr 16 14:01:32.480372 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:01:32.480332 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-6bp8d" podUID="91ffb15b-8d84-4a65-a157-65c7adaca0ea" Apr 16 14:01:35.233551 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.233510 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-kf5nw"] Apr 16 14:01:35.238666 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.238641 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.240841 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.240815 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:01:35.240973 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.240820 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:01:35.241784 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.241760 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:01:35.241895 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.241800 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nv4cs\"" Apr 16 14:01:35.241895 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.241804 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:01:35.241895 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.241816 2572 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:01:35.242061 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.241902 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:01:35.303554 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.303520 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-textfile\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.303554 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.303560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-root\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.303766 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.303593 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-wtmp\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.303766 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.303618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-metrics-client-ca\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " 
pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.303766 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.303634 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8jd5\" (UniqueName: \"kubernetes.io/projected/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-kube-api-access-n8jd5\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.303766 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.303655 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.303766 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.303687 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-accelerators-collector-config\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.303766 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.303748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-tls\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.303968 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.303784 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-sys\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405040 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-textfile\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405040 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-root\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405272 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405076 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-wtmp\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405272 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-metrics-client-ca\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405272 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405128 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8jd5\" (UniqueName: \"kubernetes.io/projected/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-kube-api-access-n8jd5\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405272 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-root\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405272 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405272 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-wtmp\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405616 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405276 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-accelerators-collector-config\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" 
Apr 16 14:01:35.405616 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405353 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-tls\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405616 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-sys\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405616 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405445 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-textfile\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405616 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-sys\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405804 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.405781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-metrics-client-ca\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.405857 ip-10-0-140-59 
kubenswrapper[2572]: I0416 14:01:35.405821 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-accelerators-collector-config\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.407447 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.407419 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.407593 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.407566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-node-exporter-tls\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.416434 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.416414 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8jd5\" (UniqueName: \"kubernetes.io/projected/6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d-kube-api-access-n8jd5\") pod \"node-exporter-kf5nw\" (UID: \"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d\") " pod="openshift-monitoring/node-exporter-kf5nw" Apr 16 14:01:35.548337 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.548297 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-kf5nw"
Apr 16 14:01:35.555508 ip-10-0-140-59 kubenswrapper[2572]: W0416 14:01:35.555483 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e8b8ab9_6e8e_4e1d_a1ab_e299696bdd3d.slice/crio-cc2349a3be2868de49cf05c7aa488a530cc6b3634ffbb028f3f41507df22b842 WatchSource:0}: Error finding container cc2349a3be2868de49cf05c7aa488a530cc6b3634ffbb028f3f41507df22b842: Status 404 returned error can't find the container with id cc2349a3be2868de49cf05c7aa488a530cc6b3634ffbb028f3f41507df22b842
Apr 16 14:01:35.884050 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:35.883958 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kf5nw" event={"ID":"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d","Type":"ContainerStarted","Data":"cc2349a3be2868de49cf05c7aa488a530cc6b3634ffbb028f3f41507df22b842"}
Apr 16 14:01:36.210610 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.210503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h"
Apr 16 14:01:36.210610 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.210581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l"
Apr 16 14:01:36.212801 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.212777 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89be9464-73dc-4031-a7c8-03fa1b9164f2-metrics-tls\") pod \"dns-default-nbj9l\" (UID: \"89be9464-73dc-4031-a7c8-03fa1b9164f2\") " pod="openshift-dns/dns-default-nbj9l"
Apr 16 14:01:36.212955 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.212932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0707b413-706b-4c25-9e10-ea274017e762-cert\") pod \"ingress-canary-4mk8h\" (UID: \"0707b413-706b-4c25-9e10-ea274017e762\") " pod="openshift-ingress-canary/ingress-canary-4mk8h"
Apr 16 14:01:36.377417 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.377387 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2c9fp\""
Apr 16 14:01:36.377417 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.377387 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wz4rp\""
Apr 16 14:01:36.385233 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.385205 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4mk8h"
Apr 16 14:01:36.385233 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.385227 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nbj9l"
Apr 16 14:01:36.511328 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.511296 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4mk8h"]
Apr 16 14:01:36.514278 ip-10-0-140-59 kubenswrapper[2572]: W0416 14:01:36.514252 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0707b413_706b_4c25_9e10_ea274017e762.slice/crio-8c65dccae4b4dc809bd900cf51f8615a2680848b3da9f0bbbee897942df0dadb WatchSource:0}: Error finding container 8c65dccae4b4dc809bd900cf51f8615a2680848b3da9f0bbbee897942df0dadb: Status 404 returned error can't find the container with id 8c65dccae4b4dc809bd900cf51f8615a2680848b3da9f0bbbee897942df0dadb
Apr 16 14:01:36.528486 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.528441 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nbj9l"]
Apr 16 14:01:36.531255 ip-10-0-140-59 kubenswrapper[2572]: W0416 14:01:36.531228 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89be9464_73dc_4031_a7c8_03fa1b9164f2.slice/crio-696b1bc302f31e633ec575cf372d3fdbe36715862677ec8f73216cc1cd8350e8 WatchSource:0}: Error finding container 696b1bc302f31e633ec575cf372d3fdbe36715862677ec8f73216cc1cd8350e8: Status 404 returned error can't find the container with id 696b1bc302f31e633ec575cf372d3fdbe36715862677ec8f73216cc1cd8350e8
Apr 16 14:01:36.886813 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.886771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nbj9l" event={"ID":"89be9464-73dc-4031-a7c8-03fa1b9164f2","Type":"ContainerStarted","Data":"696b1bc302f31e633ec575cf372d3fdbe36715862677ec8f73216cc1cd8350e8"}
Apr 16 14:01:36.887620 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:36.887603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4mk8h" event={"ID":"0707b413-706b-4c25-9e10-ea274017e762","Type":"ContainerStarted","Data":"8c65dccae4b4dc809bd900cf51f8615a2680848b3da9f0bbbee897942df0dadb"}
Apr 16 14:01:38.882483 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:38.882420 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" podUID="dcf1dd91-6854-4854-9c39-44ac8bf04253" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 14:01:39.896193 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:39.896157 2572 generic.go:358] "Generic (PLEG): container finished" podID="6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d" containerID="f5755243eb1d4af8b0df48d782a44c3197047b1738e2fc867794c7761dc55f57" exitCode=0
Apr 16 14:01:39.896767 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:39.896226 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kf5nw" event={"ID":"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d","Type":"ContainerDied","Data":"f5755243eb1d4af8b0df48d782a44c3197047b1738e2fc867794c7761dc55f57"}
Apr 16 14:01:39.897641 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:39.897613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4mk8h" event={"ID":"0707b413-706b-4c25-9e10-ea274017e762","Type":"ContainerStarted","Data":"c417a9a7677bf29cab92fefe5702067b5488a35ccce53be9c46caad035f0c1b3"}
Apr 16 14:01:39.899073 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:39.899037 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nbj9l" event={"ID":"89be9464-73dc-4031-a7c8-03fa1b9164f2","Type":"ContainerStarted","Data":"700094802bfca4ac82c5647700d86a8b65b900559c5abcf05e925385a20c7e7b"}
Apr 16 14:01:39.899073 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:39.899069 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nbj9l" event={"ID":"89be9464-73dc-4031-a7c8-03fa1b9164f2","Type":"ContainerStarted","Data":"d914f15dbcdd680ad54f9181027144823a4dacb4d9cd9a65fcfe71fddb38d3ce"}
Apr 16 14:01:39.899210 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:39.899195 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nbj9l"
Apr 16 14:01:39.926135 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:39.926092 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4mk8h" podStartSLOduration=129.552340951 podStartE2EDuration="2m11.92607583s" podCreationTimestamp="2026-04-16 13:59:28 +0000 UTC" firstStartedPulling="2026-04-16 14:01:36.516519239 +0000 UTC m=+161.672854274" lastFinishedPulling="2026-04-16 14:01:38.89025413 +0000 UTC m=+164.046589153" observedRunningTime="2026-04-16 14:01:39.925879742 +0000 UTC m=+165.082214785" watchObservedRunningTime="2026-04-16 14:01:39.92607583 +0000 UTC m=+165.082410867"
Apr 16 14:01:39.942774 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:39.942733 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nbj9l" podStartSLOduration=129.591735715 podStartE2EDuration="2m11.942717245s" podCreationTimestamp="2026-04-16 13:59:28 +0000 UTC" firstStartedPulling="2026-04-16 14:01:36.533118111 +0000 UTC m=+161.689453130" lastFinishedPulling="2026-04-16 14:01:38.884099638 +0000 UTC m=+164.040434660" observedRunningTime="2026-04-16 14:01:39.941987866 +0000 UTC m=+165.098322908" watchObservedRunningTime="2026-04-16 14:01:39.942717245 +0000 UTC m=+165.099052285"
Apr 16 14:01:40.903137 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:40.903097 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kf5nw" event={"ID":"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d","Type":"ContainerStarted","Data":"70dee4caf53a10749cc265ac08e891001db048bb66573e197aba3228750db524"}
Apr 16 14:01:40.903137 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:40.903143 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kf5nw" event={"ID":"6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d","Type":"ContainerStarted","Data":"dd970673ab25568d01f7926308a47acb57a3d71a9663a06767d9095f7afa5059"}
Apr 16 14:01:40.923242 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:40.923194 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-kf5nw" podStartSLOduration=2.59620845 podStartE2EDuration="5.923181192s" podCreationTimestamp="2026-04-16 14:01:35 +0000 UTC" firstStartedPulling="2026-04-16 14:01:35.557175346 +0000 UTC m=+160.713510366" lastFinishedPulling="2026-04-16 14:01:38.884148086 +0000 UTC m=+164.040483108" observedRunningTime="2026-04-16 14:01:40.92182295 +0000 UTC m=+166.078157992" watchObservedRunningTime="2026-04-16 14:01:40.923181192 +0000 UTC m=+166.079516234"
Apr 16 14:01:41.433997 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.433960 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:01:41.437564 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.437546 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.439845 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.439820 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 14:01:41.439971 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.439928 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-h9485\""
Apr 16 14:01:41.439971 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.439936 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 14:01:41.440087 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.440050 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 14:01:41.440144 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.440089 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 14:01:41.440828 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.440810 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 14:01:41.440828 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.440817 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 14:01:41.440968 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.440840 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4p1ab1c242cvl\""
Apr 16 14:01:41.440968 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.440847 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 14:01:41.440968 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.440945 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 14:01:41.441118 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.440971 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:01:41.441118 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.440974 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 14:01:41.441923 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.441908 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 14:01:41.445084 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.444908 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 14:01:41.449185 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.449165 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 14:01:41.454982 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.454938 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:01:41.461550 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.461445 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.461750 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.461678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.461750 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.461719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.461961 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.461899 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.461961 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.461930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.462168 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.462109 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4xk\" (UniqueName: \"kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-kube-api-access-9c4xk\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.462168 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.462143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.462364 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.462314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.462364 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.462342 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.462561 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.462532 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.462718 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.462667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-config-out\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.462872 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.462795 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-web-config\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.462872 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.462833 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.463076 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.462858 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.463076 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.463037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.463276 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.463205 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-config\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.463276 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.463237 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.463478 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.463425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564300 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564300 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-config\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564592 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564592 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564592 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564396 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564592 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564446 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564592 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564592 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564592 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564592 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c4xk\" (UniqueName: \"kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-kube-api-access-9c4xk\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564977 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564977 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564977 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564977 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564977 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564720 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-config-out\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564977 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564743 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564977 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-web-config\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564977 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.564977 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.564811 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.565419 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.565274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.565419 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.565325 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.565917 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.565889 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.567593 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.567567 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.567702 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.567598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.567702 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.567613 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.568261 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.568234 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.568943 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.568667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-config\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.568943 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.568728 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.568943 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.568736 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-config-out\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.569212 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.569131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.569432 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.569407 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.569911 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.569887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-web-config\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.570090 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.570075 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.570184 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.570167 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.570699 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.570681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.576403 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.576381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4xk\" (UniqueName: \"kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-kube-api-access-9c4xk\") pod \"prometheus-k8s-0\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.749353 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.749263 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:41.872491 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.872449 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:01:41.874684 ip-10-0-140-59 kubenswrapper[2572]: W0416 14:01:41.874663 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66b8c19b_3106_4406_9560_0119c0f4fb65.slice/crio-6a28d61605b4c71b34ee414196ae8ef766c61d4b9f7dfc1aff49342c021daf29 WatchSource:0}: Error finding container 6a28d61605b4c71b34ee414196ae8ef766c61d4b9f7dfc1aff49342c021daf29: Status 404 returned error can't find the container with id 6a28d61605b4c71b34ee414196ae8ef766c61d4b9f7dfc1aff49342c021daf29
Apr 16 14:01:41.906774 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:41.906738 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerStarted","Data":"6a28d61605b4c71b34ee414196ae8ef766c61d4b9f7dfc1aff49342c021daf29"}
Apr 16 14:01:43.914100 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:43.914060 2572 generic.go:358] "Generic (PLEG): container finished" podID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerID="a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8" exitCode=0
Apr 16 14:01:43.914584 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:43.914125 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerDied","Data":"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"}
Apr 16 14:01:46.463325 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:46.463231 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d"
Apr 16 14:01:46.924234 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:46.924201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerStarted","Data":"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"}
Apr 16 14:01:46.924234 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:46.924238 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerStarted","Data":"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"}
Apr 16 14:01:47.063508 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:47.063448 2572 patch_prober.go:28] interesting pod/image-registry-5c4c8c4dc6-5btcw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:01:47.063662 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:47.063527 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" podUID="ebcf4857-d3a9-4b7e-aa16-45f0fa462b95" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:01:48.869796 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:48.869767 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-image-registry/image-registry-5c4c8c4dc6-5btcw" Apr 16 14:01:48.882433 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:48.882399 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" podUID="dcf1dd91-6854-4854-9c39-44ac8bf04253" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:01:48.932886 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:48.932854 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerStarted","Data":"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"} Apr 16 14:01:48.933051 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:48.932891 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerStarted","Data":"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"} Apr 16 14:01:48.933051 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:48.932907 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerStarted","Data":"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"} Apr 16 14:01:48.933051 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:48.932919 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerStarted","Data":"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"} Apr 16 14:01:48.961448 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:48.961396 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.58126813 
podStartE2EDuration="7.961374118s" podCreationTimestamp="2026-04-16 14:01:41 +0000 UTC" firstStartedPulling="2026-04-16 14:01:41.876612513 +0000 UTC m=+167.032947532" lastFinishedPulling="2026-04-16 14:01:48.256718498 +0000 UTC m=+173.413053520" observedRunningTime="2026-04-16 14:01:48.959845266 +0000 UTC m=+174.116180328" watchObservedRunningTime="2026-04-16 14:01:48.961374118 +0000 UTC m=+174.117709159" Apr 16 14:01:49.905199 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:49.905163 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nbj9l" Apr 16 14:01:51.750385 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:51.750348 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:58.882547 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:58.882510 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" podUID="dcf1dd91-6854-4854-9c39-44ac8bf04253" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:01:58.882910 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:58.882578 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" Apr 16 14:01:58.883192 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:58.883159 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"abaf2403fe979a435358ecffa3f21bae0840e9d55844f9ae968930b810d46d9c"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 14:01:58.883247 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:58.883231 2572 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" podUID="dcf1dd91-6854-4854-9c39-44ac8bf04253" containerName="service-proxy" containerID="cri-o://abaf2403fe979a435358ecffa3f21bae0840e9d55844f9ae968930b810d46d9c" gracePeriod=30 Apr 16 14:01:59.962276 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:59.962233 2572 generic.go:358] "Generic (PLEG): container finished" podID="dcf1dd91-6854-4854-9c39-44ac8bf04253" containerID="abaf2403fe979a435358ecffa3f21bae0840e9d55844f9ae968930b810d46d9c" exitCode=2 Apr 16 14:01:59.962276 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:59.962279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" event={"ID":"dcf1dd91-6854-4854-9c39-44ac8bf04253","Type":"ContainerDied","Data":"abaf2403fe979a435358ecffa3f21bae0840e9d55844f9ae968930b810d46d9c"} Apr 16 14:01:59.962701 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:01:59.962305 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65f8869c57-w9xpw" event={"ID":"dcf1dd91-6854-4854-9c39-44ac8bf04253","Type":"ContainerStarted","Data":"0579c57cf877960861748bea5a4592515f572bfcb030e6d1196b8988b74eb41a"} Apr 16 14:02:41.749794 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:02:41.749749 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:41.768007 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:02:41.767985 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:42.082607 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:02:42.082580 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:59.833884 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:02:59.833848 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:02:59.834399 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:02:59.834341 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="prometheus" containerID="cri-o://da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2" gracePeriod=600 Apr 16 14:02:59.834537 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:02:59.834385 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="thanos-sidecar" containerID="cri-o://07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4" gracePeriod=600 Apr 16 14:02:59.834672 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:02:59.834512 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy-web" containerID="cri-o://c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3" gracePeriod=600 Apr 16 14:02:59.834672 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:02:59.834481 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="config-reloader" containerID="cri-o://94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b" gracePeriod=600 Apr 16 14:02:59.834784 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:02:59.834675 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy" containerID="cri-o://c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7" gracePeriod=600 
Apr 16 14:02:59.835535 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:02:59.834889 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy-thanos" containerID="cri-o://5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163" gracePeriod=600
Apr 16 14:03:00.061543 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.061517 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.113665 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113595 2572 generic.go:358] "Generic (PLEG): container finished" podID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerID="5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163" exitCode=0
Apr 16 14:03:00.113665 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113618 2572 generic.go:358] "Generic (PLEG): container finished" podID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerID="c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7" exitCode=0
Apr 16 14:03:00.113665 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113625 2572 generic.go:358] "Generic (PLEG): container finished" podID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerID="c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3" exitCode=0
Apr 16 14:03:00.113665 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113630 2572 generic.go:358] "Generic (PLEG): container finished" podID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerID="07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4" exitCode=0
Apr 16 14:03:00.113665 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113635 2572 generic.go:358] "Generic (PLEG): container finished" podID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerID="94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b" exitCode=0
Apr 16 14:03:00.113665 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113640 2572 generic.go:358] "Generic (PLEG): container finished" podID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerID="da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2" exitCode=0
Apr 16 14:03:00.113954 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113678 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerDied","Data":"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"}
Apr 16 14:03:00.113954 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113721 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerDied","Data":"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"}
Apr 16 14:03:00.113954 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerDied","Data":"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"}
Apr 16 14:03:00.113954 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerDied","Data":"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"}
Apr 16 14:03:00.113954 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113750 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerDied","Data":"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"}
Apr 16 14:03:00.113954 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113756 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.113954 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113772 2572 scope.go:117] "RemoveContainer" containerID="5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"
Apr 16 14:03:00.113954 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113758 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerDied","Data":"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"}
Apr 16 14:03:00.113954 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.113856 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"66b8c19b-3106-4406-9560-0119c0f4fb65","Type":"ContainerDied","Data":"6a28d61605b4c71b34ee414196ae8ef766c61d4b9f7dfc1aff49342c021daf29"}
Apr 16 14:03:00.122224 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.122209 2572 scope.go:117] "RemoveContainer" containerID="c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"
Apr 16 14:03:00.128438 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.128420 2572 scope.go:117] "RemoveContainer" containerID="c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"
Apr 16 14:03:00.134319 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.134300 2572 scope.go:117] "RemoveContainer" containerID="07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"
Apr 16 14:03:00.140426 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.140410 2572 scope.go:117] "RemoveContainer" containerID="94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"
Apr 16 14:03:00.146008 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.145992 2572 scope.go:117] "RemoveContainer" containerID="da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"
Apr 16 14:03:00.152964 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.152947 2572 scope.go:117] "RemoveContainer" containerID="a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"
Apr 16 14:03:00.158645 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.158630 2572 scope.go:117] "RemoveContainer" containerID="5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"
Apr 16 14:03:00.158883 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:03:00.158863 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": container with ID starting with 5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163 not found: ID does not exist" containerID="5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"
Apr 16 14:03:00.158958 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.158891 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"} err="failed to get container status \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": rpc error: code = NotFound desc = could not find container \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": container with ID starting with 5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163 not found: ID does not exist"
Apr 16 14:03:00.158958 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.158910 2572 scope.go:117] "RemoveContainer" containerID="c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"
Apr 16 14:03:00.159130 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:03:00.159115 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": container with ID starting with c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7 not found: ID does not exist" containerID="c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"
Apr 16 14:03:00.159183 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.159135 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"} err="failed to get container status \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": rpc error: code = NotFound desc = could not find container \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": container with ID starting with c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7 not found: ID does not exist"
Apr 16 14:03:00.159183 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.159150 2572 scope.go:117] "RemoveContainer" containerID="c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"
Apr 16 14:03:00.159356 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:03:00.159342 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": container with ID starting with c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3 not found: ID does not exist" containerID="c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"
Apr 16 14:03:00.159398 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.159359 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"} err="failed to get container status \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": rpc error: code = NotFound desc = could not find container \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": container with ID starting with c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3 not found: ID does not exist"
Apr 16 14:03:00.159398 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.159372 2572 scope.go:117] "RemoveContainer" containerID="07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"
Apr 16 14:03:00.159786 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:03:00.159765 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": container with ID starting with 07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4 not found: ID does not exist" containerID="07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"
Apr 16 14:03:00.159834 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.159791 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"} err="failed to get container status \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": rpc error: code = NotFound desc = could not find container \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": container with ID starting with 07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4 not found: ID does not exist"
Apr 16 14:03:00.159834 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.159804 2572 scope.go:117] "RemoveContainer" containerID="94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"
Apr 16 14:03:00.160038 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:03:00.160021 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": container with ID starting with 94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b not found: ID does not exist" containerID="94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"
Apr 16 14:03:00.160080 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.160052 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"} err="failed to get container status \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": rpc error: code = NotFound desc = could not find container \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": container with ID starting with 94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b not found: ID does not exist"
Apr 16 14:03:00.160080 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.160066 2572 scope.go:117] "RemoveContainer" containerID="da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"
Apr 16 14:03:00.160275 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:03:00.160260 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": container with ID starting with da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2 not found: ID does not exist" containerID="da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"
Apr 16 14:03:00.160326 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.160279 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"} err="failed to get container status \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": rpc error: code = NotFound desc = could not find container \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": container with ID starting with da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2 not found: ID does not exist"
Apr 16 14:03:00.160326 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.160292 2572 scope.go:117] "RemoveContainer" containerID="a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"
Apr 16 14:03:00.160488 ip-10-0-140-59 kubenswrapper[2572]: E0416 14:03:00.160474 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": container with ID starting with a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8 not found: ID does not exist" containerID="a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"
Apr 16 14:03:00.160535 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.160492 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"} err="failed to get container status \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": rpc error: code = NotFound desc = could not find container \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": container with ID starting with a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8 not found: ID does not exist"
Apr 16 14:03:00.160535 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.160503 2572 scope.go:117] "RemoveContainer" containerID="5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"
Apr 16 14:03:00.160727 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.160707 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"} err="failed to get container status \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": rpc error: code = NotFound desc = could not find container \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": container with ID starting with 5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163 not found: ID does not exist"
Apr 16 14:03:00.160727 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.160727 2572 scope.go:117] "RemoveContainer" containerID="c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"
Apr 16 14:03:00.160905 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.160888 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"} err="failed to get container status \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": rpc error: code = NotFound desc = could not find container \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": container with ID starting with c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7 not found: ID does not exist"
Apr 16 14:03:00.160950 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.160908 2572 scope.go:117] "RemoveContainer" containerID="c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"
Apr 16 14:03:00.161084 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161068 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"} err="failed to get container status \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": rpc error: code = NotFound desc = could not find container \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": container with ID starting with c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3 not found: ID does not exist"
Apr 16 14:03:00.161124 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161084 2572 scope.go:117] "RemoveContainer" containerID="07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"
Apr 16 14:03:00.161256 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161243 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"} err="failed to get container status \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": rpc error: code = NotFound desc = could not find container \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": container with ID starting with 07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4 not found: ID does not exist"
Apr 16 14:03:00.161296 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161256 2572 scope.go:117] "RemoveContainer" containerID="94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"
Apr 16 14:03:00.161434 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161419 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"} err="failed to get container status \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": rpc error: code = NotFound desc = could not find container \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": container with ID starting with 94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b not found: ID does not exist"
Apr 16 14:03:00.161523 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161435 2572 scope.go:117] "RemoveContainer" containerID="da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"
Apr 16 14:03:00.161644 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161628 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"} err="failed to get container status \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": rpc error: code = NotFound desc = could not find container \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": container with ID starting with da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2 not found: ID does not exist"
Apr 16 14:03:00.161680 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161644 2572 scope.go:117] "RemoveContainer" containerID="a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"
Apr 16 14:03:00.161824 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161808 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"} err="failed to get container status \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": rpc error: code = NotFound desc = could not find container \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": container with ID starting with a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8 not found: ID does not exist"
Apr 16 14:03:00.161866 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161824 2572 scope.go:117] "RemoveContainer" containerID="5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"
Apr 16 14:03:00.161999 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.161980 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"} err="failed to get container status \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": rpc error: code = NotFound desc = could not find container \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": container with ID starting with 5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163 not found: ID does not exist"
Apr 16 14:03:00.162040 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.162000 2572 scope.go:117] "RemoveContainer" containerID="c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"
Apr 16 14:03:00.162217 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.162201 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"} err="failed to get container status \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": rpc error: code = NotFound desc = could not find container \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": container with ID starting with c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7 not found: ID does not exist"
Apr 16 14:03:00.162217 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.162216 2572 scope.go:117] "RemoveContainer" containerID="c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"
Apr 16 14:03:00.162415 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.162400 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"} err="failed to get container status \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": rpc error: code = NotFound desc = could not find container \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": container with ID starting with c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3 not found: ID does not exist"
Apr 16 14:03:00.162415 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.162415 2572 scope.go:117] "RemoveContainer" containerID="07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"
Apr 16 14:03:00.162655 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.162637 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"} err="failed to get container status \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": rpc error: code = NotFound desc = could not find container \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": container with ID starting with 07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4 not found: ID does not exist"
Apr 16 14:03:00.162695 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.162655 2572 scope.go:117] "RemoveContainer" containerID="94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"
Apr 16 14:03:00.162863 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.162839 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"} err="failed to get container status \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": rpc error: code = NotFound desc = could not find container \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": container with ID starting with 94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b not found: ID does not exist"
Apr 16 14:03:00.162908 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.162864 2572 scope.go:117] "RemoveContainer" containerID="da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"
Apr 16 14:03:00.163085 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.163068 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"} err="failed to get container status \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": rpc error: code = NotFound desc = could not find container \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": container with ID starting with da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2 not found: ID does not exist"
Apr 16 14:03:00.163134 ip-10-0-140-59
kubenswrapper[2572]: I0416 14:03:00.163085 2572 scope.go:117] "RemoveContainer" containerID="a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8" Apr 16 14:03:00.163264 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.163249 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"} err="failed to get container status \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": rpc error: code = NotFound desc = could not find container \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": container with ID starting with a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8 not found: ID does not exist" Apr 16 14:03:00.163309 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.163265 2572 scope.go:117] "RemoveContainer" containerID="5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163" Apr 16 14:03:00.163494 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.163479 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"} err="failed to get container status \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": rpc error: code = NotFound desc = could not find container \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": container with ID starting with 5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163 not found: ID does not exist" Apr 16 14:03:00.163540 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.163494 2572 scope.go:117] "RemoveContainer" containerID="c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7" Apr 16 14:03:00.163709 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.163694 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"} err="failed to get container status \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": rpc error: code = NotFound desc = could not find container \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": container with ID starting with c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7 not found: ID does not exist" Apr 16 14:03:00.163759 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.163710 2572 scope.go:117] "RemoveContainer" containerID="c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3" Apr 16 14:03:00.163924 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.163903 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"} err="failed to get container status \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": rpc error: code = NotFound desc = could not find container \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": container with ID starting with c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3 not found: ID does not exist" Apr 16 14:03:00.163967 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.163924 2572 scope.go:117] "RemoveContainer" containerID="07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4" Apr 16 14:03:00.164125 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.164107 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"} err="failed to get container status \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": rpc error: code = NotFound desc = could not find container \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": container with ID starting with 
07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4 not found: ID does not exist" Apr 16 14:03:00.164168 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.164127 2572 scope.go:117] "RemoveContainer" containerID="94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b" Apr 16 14:03:00.164347 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.164324 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"} err="failed to get container status \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": rpc error: code = NotFound desc = could not find container \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": container with ID starting with 94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b not found: ID does not exist" Apr 16 14:03:00.164347 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.164346 2572 scope.go:117] "RemoveContainer" containerID="da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2" Apr 16 14:03:00.164579 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.164561 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"} err="failed to get container status \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": rpc error: code = NotFound desc = could not find container \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": container with ID starting with da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2 not found: ID does not exist" Apr 16 14:03:00.164635 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.164581 2572 scope.go:117] "RemoveContainer" containerID="a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8" Apr 16 14:03:00.164789 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.164766 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"} err="failed to get container status \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": rpc error: code = NotFound desc = could not find container \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": container with ID starting with a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8 not found: ID does not exist" Apr 16 14:03:00.164789 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.164787 2572 scope.go:117] "RemoveContainer" containerID="5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163" Apr 16 14:03:00.164997 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.164982 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"} err="failed to get container status \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": rpc error: code = NotFound desc = could not find container \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": container with ID starting with 5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163 not found: ID does not exist" Apr 16 14:03:00.165043 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.164998 2572 scope.go:117] "RemoveContainer" containerID="c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7" Apr 16 14:03:00.165210 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.165195 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"} err="failed to get container status \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": rpc error: code = NotFound desc = could not find container 
\"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": container with ID starting with c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7 not found: ID does not exist" Apr 16 14:03:00.165210 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.165210 2572 scope.go:117] "RemoveContainer" containerID="c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3" Apr 16 14:03:00.165430 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.165408 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"} err="failed to get container status \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": rpc error: code = NotFound desc = could not find container \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": container with ID starting with c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3 not found: ID does not exist" Apr 16 14:03:00.165523 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.165431 2572 scope.go:117] "RemoveContainer" containerID="07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4" Apr 16 14:03:00.165677 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.165662 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"} err="failed to get container status \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": rpc error: code = NotFound desc = could not find container \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": container with ID starting with 07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4 not found: ID does not exist" Apr 16 14:03:00.165721 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.165678 2572 scope.go:117] "RemoveContainer" 
containerID="94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b" Apr 16 14:03:00.165871 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.165854 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"} err="failed to get container status \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": rpc error: code = NotFound desc = could not find container \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": container with ID starting with 94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b not found: ID does not exist" Apr 16 14:03:00.165937 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.165874 2572 scope.go:117] "RemoveContainer" containerID="da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2" Apr 16 14:03:00.166067 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.166053 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"} err="failed to get container status \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": rpc error: code = NotFound desc = could not find container \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": container with ID starting with da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2 not found: ID does not exist" Apr 16 14:03:00.166121 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.166067 2572 scope.go:117] "RemoveContainer" containerID="a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8" Apr 16 14:03:00.166253 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.166240 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"} err="failed to get container status 
\"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": rpc error: code = NotFound desc = could not find container \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": container with ID starting with a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8 not found: ID does not exist" Apr 16 14:03:00.166298 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.166253 2572 scope.go:117] "RemoveContainer" containerID="5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163" Apr 16 14:03:00.166437 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.166420 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163"} err="failed to get container status \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": rpc error: code = NotFound desc = could not find container \"5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163\": container with ID starting with 5fb3bf6d636f372bc56e0d93d8d415c455d901205e7d39c658b7ab6f30f85163 not found: ID does not exist" Apr 16 14:03:00.166499 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.166437 2572 scope.go:117] "RemoveContainer" containerID="c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7" Apr 16 14:03:00.166652 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.166635 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7"} err="failed to get container status \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": rpc error: code = NotFound desc = could not find container \"c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7\": container with ID starting with c7b135f905064fe15a7ad8bbcefa0ebf8a0d3440b2b2c45565b6dcd9f3f891c7 not found: ID does not exist" Apr 16 14:03:00.166691 ip-10-0-140-59 
kubenswrapper[2572]: I0416 14:03:00.166655 2572 scope.go:117] "RemoveContainer" containerID="c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3" Apr 16 14:03:00.166880 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.166859 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3"} err="failed to get container status \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": rpc error: code = NotFound desc = could not find container \"c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3\": container with ID starting with c783e1ce73e2737e06f8b7fc83ef4f214d1b5a6c3e0b296f00930fb68f9e7dd3 not found: ID does not exist" Apr 16 14:03:00.166946 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.166881 2572 scope.go:117] "RemoveContainer" containerID="07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4" Apr 16 14:03:00.167096 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.167080 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4"} err="failed to get container status \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": rpc error: code = NotFound desc = could not find container \"07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4\": container with ID starting with 07235a15641b3f3e76ce891e826baca02698ebecf0e041ffc6f506b66b98cfc4 not found: ID does not exist" Apr 16 14:03:00.167159 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.167098 2572 scope.go:117] "RemoveContainer" containerID="94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b" Apr 16 14:03:00.167310 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.167294 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b"} err="failed to get container status \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": rpc error: code = NotFound desc = could not find container \"94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b\": container with ID starting with 94fe870631671527519556f1218104e5e05b65e2259d2eea8519c222f01da84b not found: ID does not exist" Apr 16 14:03:00.167375 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.167311 2572 scope.go:117] "RemoveContainer" containerID="da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2" Apr 16 14:03:00.167518 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.167500 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2"} err="failed to get container status \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": rpc error: code = NotFound desc = could not find container \"da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2\": container with ID starting with da1e3b840a3de66f8eec7ed55bc814ae866d209d37bb9bde2976dad15499c3c2 not found: ID does not exist" Apr 16 14:03:00.167589 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.167520 2572 scope.go:117] "RemoveContainer" containerID="a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8" Apr 16 14:03:00.167708 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.167693 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8"} err="failed to get container status \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": rpc error: code = NotFound desc = could not find container \"a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8\": container with ID starting with 
a0e3bb03f2299ec7700ba8f47fa3a95af1bd92a4660bedb0feb5e6742fdd49c8 not found: ID does not exist" Apr 16 14:03:00.214035 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214002 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-serving-certs-ca-bundle\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214035 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214037 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-trusted-ca-bundle\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214216 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214058 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-db\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214216 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214097 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214216 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214146 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c4xk\" (UniqueName: \"kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-kube-api-access-9c4xk\") pod 
\"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214216 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214174 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-config\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214216 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214203 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-tls\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214488 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214232 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-web-config\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214488 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214261 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-metrics-client-certs\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214488 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214295 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-rulefiles-0\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 
14:03:00.214488 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214326 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-kube-rbac-proxy\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214488 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214369 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-tls-assets\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214488 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214394 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-metrics-client-ca\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214488 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214422 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-grpc-tls\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214488 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214483 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-kubelet-serving-ca-bundle\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214864 ip-10-0-140-59 kubenswrapper[2572]: I0416 
14:03:00.214501 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:00.214864 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214512 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:00.214864 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214511 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-config-out\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214864 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214609 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") " Apr 16 14:03:00.214864 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214640 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-thanos-prometheus-http-client-file\") pod \"66b8c19b-3106-4406-9560-0119c0f4fb65\" (UID: \"66b8c19b-3106-4406-9560-0119c0f4fb65\") "
Apr 16 14:03:00.215086 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214944 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.215086 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.214966 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-trusted-ca-bundle\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.215379 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.215351 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:03:00.216512 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.216482 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:00.217636 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.217551 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-kube-api-access-9c4xk" (OuterVolumeSpecName: "kube-api-access-9c4xk") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "kube-api-access-9c4xk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:03:00.218021 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.217994 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:00.218102 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.218050 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:00.218158 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.218114 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:00.218158 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.218147 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-config" (OuterVolumeSpecName: "config") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:00.218401 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.218363 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:00.218522 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.218422 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-config-out" (OuterVolumeSpecName: "config-out") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:03:00.218522 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.218489 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:00.218753 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.218721 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:00.218859 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.218839 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:00.219266 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.219238 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:00.219659 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.219644 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:00.219872 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.219859 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:03:00.227344 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.227323 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-web-config" (OuterVolumeSpecName: "web-config") pod "66b8c19b-3106-4406-9560-0119c0f4fb65" (UID: "66b8c19b-3106-4406-9560-0119c0f4fb65"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:00.315386 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315331 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315386 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315380 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9c4xk\" (UniqueName: \"kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-kube-api-access-9c4xk\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315386 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315395 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-config\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315409 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-tls\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315423 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-web-config\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315434 2572 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-metrics-client-certs\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315446 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315483 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-kube-rbac-proxy\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315495 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/66b8c19b-3106-4406-9560-0119c0f4fb65-tls-assets\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315507 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-metrics-client-ca\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315518 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-grpc-tls\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315530 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b8c19b-3106-4406-9560-0119c0f4fb65-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315544 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-config-out\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315558 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315571 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/66b8c19b-3106-4406-9560-0119c0f4fb65-thanos-prometheus-http-client-file\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.315678 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.315584 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/66b8c19b-3106-4406-9560-0119c0f4fb65-prometheus-k8s-db\") on node \"ip-10-0-140-59.ec2.internal\" DevicePath \"\""
Apr 16 14:03:00.438032 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.437997 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:03:00.442052 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.442030 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:03:00.466685 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.466654 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:03:00.466943 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.466925 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="prometheus"
Apr 16 14:03:00.467014 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.466947 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="prometheus"
Apr 16 14:03:00.467014 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.466961 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy-web"
Apr 16 14:03:00.467014 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.466970 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy-web"
Apr 16 14:03:00.467014 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.466985 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="config-reloader"
Apr 16 14:03:00.467014 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.466996 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="config-reloader"
Apr 16 14:03:00.467014 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467013 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="init-config-reloader"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467022 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="init-config-reloader"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467030 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="thanos-sidecar"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467039 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="thanos-sidecar"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467048 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467056 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467065 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy-thanos"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467073 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy-thanos"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467134 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy-web"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467149 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="thanos-sidecar"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467159 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy-thanos"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467169 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="config-reloader"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467178 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="kube-rbac-proxy"
Apr 16 14:03:00.467346 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.467188 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" containerName="prometheus"
Apr 16 14:03:00.472540 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.472520 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.474823 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.474796 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 14:03:00.474912 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.474893 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 14:03:00.474993 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.474967 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 14:03:00.475219 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.475200 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 14:03:00.475301 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.475218 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 14:03:00.475301 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.475210 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 14:03:00.475301 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.475216 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4p1ab1c242cvl\""
Apr 16 14:03:00.475301 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.475263 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 14:03:00.475497 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.475357 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 14:03:00.475772 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.475755 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 14:03:00.475876 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.475856 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-h9485\""
Apr 16 14:03:00.475944 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.475881 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:03:00.476033 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.476013 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 14:03:00.477745 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.477718 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 14:03:00.479796 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.479777 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 14:03:00.482647 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.482625 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:03:00.617938 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.617898 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.617938 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.617941 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618111 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.617976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618111 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618014 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618111 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618039 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-web-config\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618111 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618249 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618131 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618249 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618163 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618249 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618249 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618210 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-config\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618249 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618249 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618418 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618267 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d86c68c5-7378-477c-b14a-b446ef94407a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618418 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d86c68c5-7378-477c-b14a-b446ef94407a-config-out\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618418 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618333 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtdms\" (UniqueName: \"kubernetes.io/projected/d86c68c5-7378-477c-b14a-b446ef94407a-kube-api-access-mtdms\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618418 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618349 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618418 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618364 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.618418 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.618396 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d86c68c5-7378-477c-b14a-b446ef94407a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.719642 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-web-config\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.719642 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.719642 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.719870 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.719870 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.719870 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-config\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.719870 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720069 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720069 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d86c68c5-7378-477c-b14a-b446ef94407a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720069 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719952 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d86c68c5-7378-477c-b14a-b446ef94407a-config-out\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720069 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.719983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtdms\" (UniqueName: \"kubernetes.io/projected/d86c68c5-7378-477c-b14a-b446ef94407a-kube-api-access-mtdms\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720069 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.720012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720069 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.720036 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720069 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.720066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d86c68c5-7378-477c-b14a-b446ef94407a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720376 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.720097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720376 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.720144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720376 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.720168 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720376 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.720240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720615 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.720484 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.720615 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.720562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.721524 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.721357 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d86c68c5-7378-477c-b14a-b446ef94407a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.721834 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.721807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.722302 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.722273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.722907 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.722865 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-web-config\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.723162 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.723145 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-config\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.723711 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.723662 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-kube-rbac-proxy\") pod 
\"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.723787 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.723738 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.723787 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.723753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d86c68c5-7378-477c-b14a-b446ef94407a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.724163 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.724142 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d86c68c5-7378-477c-b14a-b446ef94407a-config-out\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.724349 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.724331 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.724631 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.724609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.725212 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.725192 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.725687 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.725668 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d86c68c5-7378-477c-b14a-b446ef94407a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.725738 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.725719 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.725777 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.725743 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d86c68c5-7378-477c-b14a-b446ef94407a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.728588 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.728573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mtdms\" (UniqueName: \"kubernetes.io/projected/d86c68c5-7378-477c-b14a-b446ef94407a-kube-api-access-mtdms\") pod \"prometheus-k8s-0\" (UID: \"d86c68c5-7378-477c-b14a-b446ef94407a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.782886 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.782855 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:00.904200 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:00.903982 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:00.906412 ip-10-0-140-59 kubenswrapper[2572]: W0416 14:03:00.906385 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd86c68c5_7378_477c_b14a_b446ef94407a.slice/crio-68d52e04fd4e825e6242c5d20ad6ae536352b5837b596caff4b359f88a3fe9cd WatchSource:0}: Error finding container 68d52e04fd4e825e6242c5d20ad6ae536352b5837b596caff4b359f88a3fe9cd: Status 404 returned error can't find the container with id 68d52e04fd4e825e6242c5d20ad6ae536352b5837b596caff4b359f88a3fe9cd Apr 16 14:03:01.118719 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:01.118681 2572 generic.go:358] "Generic (PLEG): container finished" podID="d86c68c5-7378-477c-b14a-b446ef94407a" containerID="321fbfd4d36baf2a13e3a529d3fd987bcd23ad8fae8e005f3d69df0ba5b01737" exitCode=0 Apr 16 14:03:01.118719 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:01.118723 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d86c68c5-7378-477c-b14a-b446ef94407a","Type":"ContainerDied","Data":"321fbfd4d36baf2a13e3a529d3fd987bcd23ad8fae8e005f3d69df0ba5b01737"} Apr 16 14:03:01.118913 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:01.118746 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"d86c68c5-7378-477c-b14a-b446ef94407a","Type":"ContainerStarted","Data":"68d52e04fd4e825e6242c5d20ad6ae536352b5837b596caff4b359f88a3fe9cd"} Apr 16 14:03:01.468642 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:01.468610 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b8c19b-3106-4406-9560-0119c0f4fb65" path="/var/lib/kubelet/pods/66b8c19b-3106-4406-9560-0119c0f4fb65/volumes" Apr 16 14:03:02.124945 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:02.124854 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d86c68c5-7378-477c-b14a-b446ef94407a","Type":"ContainerStarted","Data":"a7286e2efa9475f1a87156c24c101c9076d219ee07de2ea74c45b8b9f9cdb42c"} Apr 16 14:03:02.124945 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:02.124891 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d86c68c5-7378-477c-b14a-b446ef94407a","Type":"ContainerStarted","Data":"d75600c659f6cf7fad95ba902c4f0883b07e3978ed73b4b563f670cc5034ec96"} Apr 16 14:03:02.124945 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:02.124900 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d86c68c5-7378-477c-b14a-b446ef94407a","Type":"ContainerStarted","Data":"36e36c33d6144178002aa5927e307b5275ed0c20802e25d66b4e03c597645899"} Apr 16 14:03:02.124945 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:02.124910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d86c68c5-7378-477c-b14a-b446ef94407a","Type":"ContainerStarted","Data":"78919e11c0097d32d1c6a03f77f20ea3021f109ca00815d6863f4562b82ebb09"} Apr 16 14:03:02.124945 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:02.124918 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"d86c68c5-7378-477c-b14a-b446ef94407a","Type":"ContainerStarted","Data":"ee25a7c1e222ea0a76c477f10d45e2307450bfb67af85e2c908918fc0ad039b5"} Apr 16 14:03:02.124945 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:02.124926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d86c68c5-7378-477c-b14a-b446ef94407a","Type":"ContainerStarted","Data":"4f61b77f6d924c1f482ae3c29e7a1c580434e499aec383da9f0303f45e92a205"} Apr 16 14:03:02.150948 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:02.150897 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.150882958 podStartE2EDuration="2.150882958s" podCreationTimestamp="2026-04-16 14:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:03:02.149770623 +0000 UTC m=+247.306105660" watchObservedRunningTime="2026-04-16 14:03:02.150882958 +0000 UTC m=+247.307218003" Apr 16 14:03:05.783213 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:05.783172 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.364799 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:06.364761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: \"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 14:03:06.367537 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:06.367509 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91ffb15b-8d84-4a65-a157-65c7adaca0ea-metrics-certs\") pod \"network-metrics-daemon-6bp8d\" (UID: 
\"91ffb15b-8d84-4a65-a157-65c7adaca0ea\") " pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 14:03:06.565981 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:06.565952 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b98gv\"" Apr 16 14:03:06.573962 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:06.573943 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6bp8d" Apr 16 14:03:06.685014 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:06.684981 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6bp8d"] Apr 16 14:03:06.689503 ip-10-0-140-59 kubenswrapper[2572]: W0416 14:03:06.689477 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91ffb15b_8d84_4a65_a157_65c7adaca0ea.slice/crio-b5e96437260ea48bb4aafa1cd703572a4abc47ec1ed4bbcd9fc325ec544026d6 WatchSource:0}: Error finding container b5e96437260ea48bb4aafa1cd703572a4abc47ec1ed4bbcd9fc325ec544026d6: Status 404 returned error can't find the container with id b5e96437260ea48bb4aafa1cd703572a4abc47ec1ed4bbcd9fc325ec544026d6 Apr 16 14:03:07.140337 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:07.140300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6bp8d" event={"ID":"91ffb15b-8d84-4a65-a157-65c7adaca0ea","Type":"ContainerStarted","Data":"b5e96437260ea48bb4aafa1cd703572a4abc47ec1ed4bbcd9fc325ec544026d6"} Apr 16 14:03:08.144160 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:08.144121 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6bp8d" event={"ID":"91ffb15b-8d84-4a65-a157-65c7adaca0ea","Type":"ContainerStarted","Data":"b552b33eccc6c5b06c81ad49fc7eecc8b6cdf11d7f13f9cd415a334a20f46bbc"} Apr 16 14:03:08.144160 ip-10-0-140-59 
kubenswrapper[2572]: I0416 14:03:08.144164 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6bp8d" event={"ID":"91ffb15b-8d84-4a65-a157-65c7adaca0ea","Type":"ContainerStarted","Data":"a507e423c0bb139de3cbccb365d71fa58a64d4ba9b8cd1a85463c2baee3f8ff1"} Apr 16 14:03:08.157924 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:08.157876 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6bp8d" podStartSLOduration=252.108156405 podStartE2EDuration="4m13.157860381s" podCreationTimestamp="2026-04-16 13:58:55 +0000 UTC" firstStartedPulling="2026-04-16 14:03:06.691193518 +0000 UTC m=+251.847528537" lastFinishedPulling="2026-04-16 14:03:07.740897494 +0000 UTC m=+252.897232513" observedRunningTime="2026-04-16 14:03:08.157187805 +0000 UTC m=+253.313522846" watchObservedRunningTime="2026-04-16 14:03:08.157860381 +0000 UTC m=+253.314195422" Apr 16 14:03:55.358699 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:03:55.358675 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:04:00.783997 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:04:00.783958 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:04:00.798868 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:04:00.798845 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:04:01.296883 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:04:01.296857 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:05:05.354475 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.354357 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9pcqh"] Apr 16 14:05:05.357573 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.357553 2572 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.359948 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.359925 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:05:05.365199 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.365175 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9pcqh"] Apr 16 14:05:05.420896 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.420861 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64d8850d-1d10-4767-9fc4-a4021c5694d1-dbus\") pod \"global-pull-secret-syncer-9pcqh\" (UID: \"64d8850d-1d10-4767-9fc4-a4021c5694d1\") " pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.421053 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.420903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64d8850d-1d10-4767-9fc4-a4021c5694d1-kubelet-config\") pod \"global-pull-secret-syncer-9pcqh\" (UID: \"64d8850d-1d10-4767-9fc4-a4021c5694d1\") " pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.421053 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.420992 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64d8850d-1d10-4767-9fc4-a4021c5694d1-original-pull-secret\") pod \"global-pull-secret-syncer-9pcqh\" (UID: \"64d8850d-1d10-4767-9fc4-a4021c5694d1\") " pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.521956 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.521911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/64d8850d-1d10-4767-9fc4-a4021c5694d1-dbus\") pod \"global-pull-secret-syncer-9pcqh\" (UID: \"64d8850d-1d10-4767-9fc4-a4021c5694d1\") " pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.522158 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.522001 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64d8850d-1d10-4767-9fc4-a4021c5694d1-kubelet-config\") pod \"global-pull-secret-syncer-9pcqh\" (UID: \"64d8850d-1d10-4767-9fc4-a4021c5694d1\") " pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.522158 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.522123 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64d8850d-1d10-4767-9fc4-a4021c5694d1-dbus\") pod \"global-pull-secret-syncer-9pcqh\" (UID: \"64d8850d-1d10-4767-9fc4-a4021c5694d1\") " pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.522285 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.522204 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64d8850d-1d10-4767-9fc4-a4021c5694d1-original-pull-secret\") pod \"global-pull-secret-syncer-9pcqh\" (UID: \"64d8850d-1d10-4767-9fc4-a4021c5694d1\") " pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.522621 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.522597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64d8850d-1d10-4767-9fc4-a4021c5694d1-kubelet-config\") pod \"global-pull-secret-syncer-9pcqh\" (UID: \"64d8850d-1d10-4767-9fc4-a4021c5694d1\") " pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.524547 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.524530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64d8850d-1d10-4767-9fc4-a4021c5694d1-original-pull-secret\") pod \"global-pull-secret-syncer-9pcqh\" (UID: \"64d8850d-1d10-4767-9fc4-a4021c5694d1\") " pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.667697 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.667603 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9pcqh" Apr 16 14:05:05.780780 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.780749 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9pcqh"] Apr 16 14:05:05.783972 ip-10-0-140-59 kubenswrapper[2572]: W0416 14:05:05.783940 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64d8850d_1d10_4767_9fc4_a4021c5694d1.slice/crio-476bd7e169576e1851483e49d7203a63b1ee5d05197478ab666beb54898ea95d WatchSource:0}: Error finding container 476bd7e169576e1851483e49d7203a63b1ee5d05197478ab666beb54898ea95d: Status 404 returned error can't find the container with id 476bd7e169576e1851483e49d7203a63b1ee5d05197478ab666beb54898ea95d Apr 16 14:05:05.785695 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:05.785679 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:05:06.456712 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:06.456673 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9pcqh" event={"ID":"64d8850d-1d10-4767-9fc4-a4021c5694d1","Type":"ContainerStarted","Data":"476bd7e169576e1851483e49d7203a63b1ee5d05197478ab666beb54898ea95d"} Apr 16 14:05:10.469137 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:10.469094 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9pcqh" 
event={"ID":"64d8850d-1d10-4767-9fc4-a4021c5694d1","Type":"ContainerStarted","Data":"252980bf8f01f5dcb16979cb1440d4c45cf2f7d72bf05a1d6d1c333bcd175252"} Apr 16 14:05:10.484378 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:05:10.484330 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9pcqh" podStartSLOduration=1.554046677 podStartE2EDuration="5.484315542s" podCreationTimestamp="2026-04-16 14:05:05 +0000 UTC" firstStartedPulling="2026-04-16 14:05:05.785809128 +0000 UTC m=+370.942144147" lastFinishedPulling="2026-04-16 14:05:09.71607798 +0000 UTC m=+374.872413012" observedRunningTime="2026-04-16 14:05:10.483160279 +0000 UTC m=+375.639495331" watchObservedRunningTime="2026-04-16 14:05:10.484315542 +0000 UTC m=+375.640650583" Apr 16 14:22:34.793619 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:34.793574 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9pcqh_64d8850d-1d10-4767-9fc4-a4021c5694d1/global-pull-secret-syncer/0.log" Apr 16 14:22:34.893107 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:34.893077 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-g9npf_59f23d5d-a915-4636-b347-7d14ae37dbed/konnectivity-agent/0.log" Apr 16 14:22:35.012742 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:35.012708 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-59.ec2.internal_b1aecfc38a711305ef53a73c57dbb3d6/haproxy/0.log" Apr 16 14:22:38.955231 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:38.955201 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kf5nw_6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d/node-exporter/0.log" Apr 16 14:22:38.978958 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:38.978936 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-kf5nw_6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d/kube-rbac-proxy/0.log" Apr 16 14:22:38.998689 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:38.998660 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kf5nw_6e8b8ab9-6e8e-4e1d-a1ab-e299696bdd3d/init-textfile/0.log" Apr 16 14:22:39.103342 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:39.103287 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d86c68c5-7378-477c-b14a-b446ef94407a/prometheus/0.log" Apr 16 14:22:39.123015 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:39.122963 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d86c68c5-7378-477c-b14a-b446ef94407a/config-reloader/0.log" Apr 16 14:22:39.142970 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:39.142951 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d86c68c5-7378-477c-b14a-b446ef94407a/thanos-sidecar/0.log" Apr 16 14:22:39.165594 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:39.165569 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d86c68c5-7378-477c-b14a-b446ef94407a/kube-rbac-proxy-web/0.log" Apr 16 14:22:39.188608 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:39.188589 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d86c68c5-7378-477c-b14a-b446ef94407a/kube-rbac-proxy/0.log" Apr 16 14:22:39.208293 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:39.208274 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d86c68c5-7378-477c-b14a-b446ef94407a/kube-rbac-proxy-thanos/0.log" Apr 16 14:22:39.230667 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:39.230639 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d86c68c5-7378-477c-b14a-b446ef94407a/init-config-reloader/0.log" Apr 16 14:22:41.976525 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:41.976489 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"] Apr 16 14:22:41.978898 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:41.978878 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w" Apr 16 14:22:41.981093 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:41.981072 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-p6v84\"/\"default-dockercfg-hg4wh\"" Apr 16 14:22:41.981202 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:41.981112 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p6v84\"/\"kube-root-ca.crt\"" Apr 16 14:22:41.981855 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:41.981838 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p6v84\"/\"openshift-service-ca.crt\"" Apr 16 14:22:41.986351 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:41.986332 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"] Apr 16 14:22:42.055488 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.055432 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-lib-modules\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w" Apr 16 14:22:42.055648 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.055520 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54pn2\" (UniqueName: \"kubernetes.io/projected/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-kube-api-access-54pn2\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.055648 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.055574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-podres\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.055648 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.055626 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-sys\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.055761 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.055693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-proc\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.156582 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.156549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-lib-modules\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.156714 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.156592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54pn2\" (UniqueName: \"kubernetes.io/projected/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-kube-api-access-54pn2\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.156714 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.156614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-podres\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.156714 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.156633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-sys\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.156714 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.156700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-sys\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.156887 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.156714 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-lib-modules\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.156887 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.156747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-proc\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.156887 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.156753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-podres\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.156887 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.156805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-proc\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.164020 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.163993 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54pn2\" (UniqueName: \"kubernetes.io/projected/895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9-kube-api-access-54pn2\") pod \"perf-node-gather-daemonset-hqw4w\" (UID: \"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.288814 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.288784 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:42.400357 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.400326 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"]
Apr 16 14:22:42.403478 ip-10-0-140-59 kubenswrapper[2572]: W0416 14:22:42.403439 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod895b374e_d0d9_4c3b_a8b1_a6ddf4dbb6f9.slice/crio-4b5c37b48500032584f8fdc16258cea47226ad411d03d7ca7c5dd05de2267e39 WatchSource:0}: Error finding container 4b5c37b48500032584f8fdc16258cea47226ad411d03d7ca7c5dd05de2267e39: Status 404 returned error can't find the container with id 4b5c37b48500032584f8fdc16258cea47226ad411d03d7ca7c5dd05de2267e39
Apr 16 14:22:42.405325 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.405308 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:22:42.497386 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.497357 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nbj9l_89be9464-73dc-4031-a7c8-03fa1b9164f2/dns/0.log"
Apr 16 14:22:42.520646 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.520624 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nbj9l_89be9464-73dc-4031-a7c8-03fa1b9164f2/kube-rbac-proxy/0.log"
Apr 16 14:22:42.681013 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:42.680938 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lnsdn_9b706a4d-2ea5-4651-bfda-d3c5cdc3fe5d/dns-node-resolver/0.log"
Apr 16 14:22:43.068472 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:43.068425 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5c4c8c4dc6-5btcw_ebcf4857-d3a9-4b7e-aa16-45f0fa462b95/registry/0.log"
Apr 16 14:22:43.132469 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:43.132394 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5p4kh_e11c6740-8d55-4673-8d82-f90f6a93b413/node-ca/0.log"
Apr 16 14:22:43.193511 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:43.193475 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w" event={"ID":"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9","Type":"ContainerStarted","Data":"c90e946750edf1dfe7e571d2be6e1ed8e1eb683109806231816a698338ddc3c1"}
Apr 16 14:22:43.193659 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:43.193516 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w" event={"ID":"895b374e-d0d9-4c3b-a8b1-a6ddf4dbb6f9","Type":"ContainerStarted","Data":"4b5c37b48500032584f8fdc16258cea47226ad411d03d7ca7c5dd05de2267e39"}
Apr 16 14:22:43.193659 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:43.193592 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:43.209329 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:43.209284 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w" podStartSLOduration=2.20927004 podStartE2EDuration="2.20927004s" podCreationTimestamp="2026-04-16 14:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:22:43.207640127 +0000 UTC m=+1428.363975168" watchObservedRunningTime="2026-04-16 14:22:43.20927004 +0000 UTC m=+1428.365605119"
Apr 16 14:22:44.216769 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:44.216736 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4mk8h_0707b413-706b-4c25-9e10-ea274017e762/serve-healthcheck-canary/0.log"
Apr 16 14:22:44.694367 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:44.694324 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pfcp5_e6fafb01-43fc-4828-8367-8e3b641523ae/kube-rbac-proxy/0.log"
Apr 16 14:22:44.714885 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:44.714858 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pfcp5_e6fafb01-43fc-4828-8367-8e3b641523ae/exporter/0.log"
Apr 16 14:22:44.736195 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:44.736175 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pfcp5_e6fafb01-43fc-4828-8367-8e3b641523ae/extractor/0.log"
Apr 16 14:22:49.205413 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:49.205382 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-hqw4w"
Apr 16 14:22:51.811226 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:51.811199 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb72s_eee60be0-add8-410b-982e-1aa1f11ec111/kube-multus-additional-cni-plugins/0.log"
Apr 16 14:22:51.837383 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:51.837349 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb72s_eee60be0-add8-410b-982e-1aa1f11ec111/egress-router-binary-copy/0.log"
Apr 16 14:22:51.861320 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:51.861298 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb72s_eee60be0-add8-410b-982e-1aa1f11ec111/cni-plugins/0.log"
Apr 16 14:22:51.884288 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:51.884267 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb72s_eee60be0-add8-410b-982e-1aa1f11ec111/bond-cni-plugin/0.log"
Apr 16 14:22:51.909075 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:51.909008 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb72s_eee60be0-add8-410b-982e-1aa1f11ec111/routeoverride-cni/0.log"
Apr 16 14:22:51.930894 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:51.930871 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb72s_eee60be0-add8-410b-982e-1aa1f11ec111/whereabouts-cni-bincopy/0.log"
Apr 16 14:22:51.952571 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:51.952549 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb72s_eee60be0-add8-410b-982e-1aa1f11ec111/whereabouts-cni/0.log"
Apr 16 14:22:52.338216 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:52.338190 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv4ws_ccd40d4c-3ebd-4f04-b7c0-a807865cb3c7/kube-multus/0.log"
Apr 16 14:22:52.359837 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:52.359814 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6bp8d_91ffb15b-8d84-4a65-a157-65c7adaca0ea/network-metrics-daemon/0.log"
Apr 16 14:22:52.381518 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:52.381487 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6bp8d_91ffb15b-8d84-4a65-a157-65c7adaca0ea/kube-rbac-proxy/0.log"
Apr 16 14:22:53.844151 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:53.844116 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppbcs_40fcc100-8a15-40b9-a4d8-8c9913394f91/ovn-controller/0.log"
Apr 16 14:22:53.872889 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:53.872861 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppbcs_40fcc100-8a15-40b9-a4d8-8c9913394f91/ovn-acl-logging/0.log"
Apr 16 14:22:53.890982 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:53.890959 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppbcs_40fcc100-8a15-40b9-a4d8-8c9913394f91/kube-rbac-proxy-node/0.log"
Apr 16 14:22:53.910756 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:53.910731 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppbcs_40fcc100-8a15-40b9-a4d8-8c9913394f91/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 14:22:53.930906 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:53.930882 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppbcs_40fcc100-8a15-40b9-a4d8-8c9913394f91/northd/0.log"
Apr 16 14:22:53.954696 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:53.954670 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppbcs_40fcc100-8a15-40b9-a4d8-8c9913394f91/nbdb/0.log"
Apr 16 14:22:53.982788 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:53.982757 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppbcs_40fcc100-8a15-40b9-a4d8-8c9913394f91/sbdb/0.log"
Apr 16 14:22:54.082251 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:54.082222 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppbcs_40fcc100-8a15-40b9-a4d8-8c9913394f91/ovnkube-controller/0.log"
Apr 16 14:22:55.075646 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:55.075613 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bct9b_fb8eecda-88c7-4d10-97ed-5f758d438dc2/network-check-target-container/0.log"
Apr 16 14:22:56.043266 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:56.043229 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-kpzwx_01b92520-2c04-454e-8a4a-c542e9075e22/iptables-alerter/0.log"
Apr 16 14:22:56.665442 ip-10-0-140-59 kubenswrapper[2572]: I0416 14:22:56.665411 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-5rc24_d63d695c-f063-4981-a993-07dd8b11f193/tuned/0.log"