Apr 17 07:49:03.509552 ip-10-0-142-45 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 07:49:03.509570 ip-10-0-142-45 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 07:49:03.509581 ip-10-0-142-45 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 07:49:03.509925 ip-10-0-142-45 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 07:49:13.601480 ip-10-0-142-45 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 07:49:13.601497 ip-10-0-142-45 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot ae2e1fb09de54ab09728d01ff12efb7d --
Apr 17 07:51:46.172193 ip-10-0-142-45 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 07:51:46.615153 ip-10-0-142-45 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:46.615153 ip-10-0-142-45 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 07:51:46.615153 ip-10-0-142-45 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:46.615153 ip-10-0-142-45 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 07:51:46.615153 ip-10-0-142-45 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:46.616705 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.616619    2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 07:51:46.621680 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621665    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:46.621680 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621679    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621683    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621687    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621691    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621695    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621697    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621700    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621703    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621706    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621709    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621712    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621714    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621717    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621720    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621723    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621725    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621728    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621731    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621739    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621742    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:46.621743 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621745    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621748    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621751    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621753    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621756    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621759    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621761    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621764    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621767    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621769    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621771    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621774    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621776    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621779    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621782    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621786    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621790    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621792    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621795    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:46.622375 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621798    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621800    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621803    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621805    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621807    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621810    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621813    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621815    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621818    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621820    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621822    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621825    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621828    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621830    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621833    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621836    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621838    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621841    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621845    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621849    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:46.623077 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621851    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621854    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621857    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621862    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621864    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621867    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621869    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621871    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621874    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621876    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621890    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621894    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621896    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621899    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621901    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621904    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621906    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621910    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621912    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621915    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:46.623558 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621917    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621920    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621922    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621925    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621927    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.621930    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623648    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623659    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623663    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623666    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623669    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623671    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623674    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623677    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623680    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623683    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623685    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623688    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623691    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623694    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:46.624041 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623697    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623700    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623702    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623705    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623708    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623710    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623713    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623715    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623718    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623720    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623722    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623729    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623732    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623735    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623738    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623741    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623746    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623748    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623751    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:46.624501 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623755    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623759    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623762    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623765    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623768    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623771    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623773    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623776    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623778    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623781    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623784    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623786    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623788    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623791    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623794    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623796    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623799    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623801    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623805    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:46.625098 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623808    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623811    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623813    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623816    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623818    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623821    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623824    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623826    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623829    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623831    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623833    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623836    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623838    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623842    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623845    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623847    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623849    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623852    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623854    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623856    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:46.625592 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623859    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623862    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623864    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623866    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623869    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623871    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623873    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623876    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623878    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623898    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623901    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623903    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623906    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.623908    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.623977    2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.623984    2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.623991    2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.623996    2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624000    2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624003    2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624007    2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 07:51:46.626096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624011    2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624014    2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624018    2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624021    2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624025    2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624028    2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624031    2579 flags.go:64] FLAG: --cgroup-root=""
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624034    2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624036    2579 flags.go:64] FLAG: --client-ca-file=""
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624039    2579 flags.go:64] FLAG: --cloud-config=""
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624042    2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624045    2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624049    2579 flags.go:64] FLAG: --cluster-domain=""
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624051    2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624055    2579 flags.go:64] FLAG: --config-dir=""
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624057    2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624060    2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624068    2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624071    2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624074    2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624077    2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624080    2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624082    2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624085    2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624088    2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 07:51:46.626608 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624091    2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417
07:51:46.624095 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624097 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624100 2579 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624103 2579 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624105 2579 flags.go:64] FLAG: --enable-server="true" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624108 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624113 2579 flags.go:64] FLAG: --event-burst="100" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624116 2579 flags.go:64] FLAG: --event-qps="50" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624119 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624122 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624125 2579 flags.go:64] FLAG: --eviction-hard="" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624129 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624132 2579 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624135 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624138 2579 flags.go:64] FLAG: --eviction-soft="" Apr 17 07:51:46.627212 
ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624141 2579 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624144 2579 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624146 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624149 2579 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624152 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624155 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624158 2579 flags.go:64] FLAG: --feature-gates="" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624161 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624164 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 07:51:46.627212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624168 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624171 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624174 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624177 2579 flags.go:64] FLAG: --help="false" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624180 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-142-45.ec2.internal" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624182 2579 flags.go:64] FLAG: 
--housekeeping-interval="10s" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624185 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624188 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624191 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624194 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624197 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624199 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624202 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624204 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624207 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624210 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624213 2579 flags.go:64] FLAG: --kube-reserved="" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624216 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624219 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: 
I0417 07:51:46.624222 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624225 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624228 2579 flags.go:64] FLAG: --lock-file="" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624230 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624233 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 07:51:46.627783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624236 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624241 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624243 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624246 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624249 2579 flags.go:64] FLAG: --logging-format="text" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624251 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624255 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624257 2579 flags.go:64] FLAG: --manifest-url="" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624260 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624264 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 07:51:46.628358 
ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624267 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624271 2579 flags.go:64] FLAG: --max-pods="110" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624274 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624277 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624279 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624282 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624285 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624288 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624291 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624297 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624300 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624303 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624306 2579 flags.go:64] FLAG: --pod-cidr="" Apr 17 07:51:46.628358 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624308 2579 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624313 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624315 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624324 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624326 2579 flags.go:64] FLAG: --port="10250" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624329 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624332 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e91777e0a2234912" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624334 2579 flags.go:64] FLAG: --qos-reserved="" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624337 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624340 2579 flags.go:64] FLAG: --register-node="true" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624342 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624345 2579 flags.go:64] FLAG: --register-with-taints="" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624348 2579 flags.go:64] FLAG: --registry-burst="10" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624351 2579 flags.go:64] FLAG: --registry-qps="5" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624353 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 17 07:51:46.628905 
ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624356 2579 flags.go:64] FLAG: --reserved-memory="" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624359 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624362 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624365 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624368 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624371 2579 flags.go:64] FLAG: --runonce="false" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624374 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624376 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624379 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624382 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624385 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 07:51:46.628905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624388 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624391 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624394 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624396 2579 flags.go:64] 
FLAG: --storage-driver-secure="false" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624399 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624401 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624404 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624406 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624409 2579 flags.go:64] FLAG: --system-cgroups="" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624413 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624418 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624421 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624423 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624427 2579 flags.go:64] FLAG: --tls-min-version="" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624430 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624432 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624435 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624437 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 07:51:46.629565 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:51:46.624440 2579 flags.go:64] FLAG: --v="2" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624444 2579 flags.go:64] FLAG: --version="false" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624448 2579 flags.go:64] FLAG: --vmodule="" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624452 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624455 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624536 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624540 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:51:46.629565 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624542 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624547 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624550 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624553 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624555 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624559 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624561 2579 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624564 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624566 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624569 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624571 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624573 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624576 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624578 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624581 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624583 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624587 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624589 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:51:46.630169 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624592 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:51:46.630169 
ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624594 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624596 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624599 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624601 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624604 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624607 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624609 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624612 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624614 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624616 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624619 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624622 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624626 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624629 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624631 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624634 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624636 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624639 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624642 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:51:46.630652 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624644 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624647 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624649 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624652 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624654 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624656 2579 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624658 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624661 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624663 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624665 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624669 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624673 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624675 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624677 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624680 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624682 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624684 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624687 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:51:46.631118 ip-10-0-142-45 
kubenswrapper[2579]: W0417 07:51:46.624689 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624692 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:51:46.631118 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624695 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624697 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624699 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624701 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624704 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624706 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624708 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624711 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624713 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624716 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624719 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 
07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624722 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624724 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624727 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624731 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624734 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624737 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624739 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624742 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624744 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:51:46.631595 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624747 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624749 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624753 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624756 2579 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624758 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.624760 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.624768 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.631190 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.631206 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631252 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631257 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631260 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631263 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631266 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631269 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631272 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:46.632079 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631275 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631278 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631280 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631283 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631285 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631289 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631291 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631294 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631296 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631299 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631301 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631304 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631306 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631308 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631311 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631313 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631317 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631321 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631324 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:46.632468 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631326 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631329 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631331 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631334 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631336 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631339 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631342 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631345 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631347 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631350 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631352 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631356 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631360 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631362 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631365 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631367 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631370 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631372 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631374 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:46.632984 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631377 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631380 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631382 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631384 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631387 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631389 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631391 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631394 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631396 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631398 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631400 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631403 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631405 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631407 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631410 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631412 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631414 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631417 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631420 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631422 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:46.633487 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631425 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631427 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631429 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631432 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631434 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631437 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631440 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631442 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631444 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631446 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631449 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631451 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631454 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631457 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631459 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631461 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631464 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631466 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631468 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631471 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:46.633991 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631473 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.631478 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631602 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631606 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631610 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631614 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631616 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631619 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631622 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631625 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631628 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631631 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631634 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631638 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:46.634495 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631641 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631643 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631646 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631649 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631651 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631654 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631656 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631659 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631661 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631663 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631666 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631668 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631672 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631674 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631676 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631678 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631681 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631683 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631686 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631688 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:46.634837 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631691 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631693 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631695 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631698 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631700 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631702 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631705 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631708 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631710 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631713 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631715 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631717 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631719 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631722 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631724 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631726 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631728 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631731 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631733 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631735 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:46.635327 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631738 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631740 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631742 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631744 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631747 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631749 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631752 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631754 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631757 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631759 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631762 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631764 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631766 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631769 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631771 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631773 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631776 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631778 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631780 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631783 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:46.635820 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631785 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631788 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631790 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631792 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631795 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631797 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631799 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631802 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631805 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631807 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631809 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631812 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631814 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:46.631816 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.631820 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:46.636309 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.632592 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 07:51:46.636664 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.635523 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 07:51:46.636664 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.636527 2579 server.go:1019] "Starting client certificate rotation"
Apr 17 07:51:46.636664 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.636626 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:46.636664 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.636662 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:46.663550 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.663528 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:46.666621 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.666602 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:46.686094 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.686070 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 17 07:51:46.692499 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.692481 2579 log.go:25] "Validated CRI v1 image API"
Apr 17 07:51:46.694004 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.693917 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 07:51:46.697713 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.697695 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:51:46.698455 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.698437 2579 fs.go:135] Filesystem UUIDs: map[6f259580-bae3-4514-8929-8599cafb2d78:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9e03234d-f8f0-4022-ac23-160e34aeb818:/dev/nvme0n1p4]
Apr 17 07:51:46.698502 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.698457 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 07:51:46.705181 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.705078 2579 manager.go:217] Machine: {Timestamp:2026-04-17 07:51:46.702147594 +0000 UTC m=+0.411751098 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3131924 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec252334df500f9f292d8fa43911c544 SystemUUID:ec252334-df50-0f9f-292d-8fa43911c544 BootID:ae2e1fb0-9de5-4ab0-9728-d01ff12efb7d Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5f:dd:11:ce:0f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5f:dd:11:ce:0f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:20:81:fb:87:bc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 07:51:46.705181 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.705176 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 07:51:46.705311 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.705258 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 07:51:46.707459 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.707439 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 07:51:46.707622 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.707461 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-45.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessT
han","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 07:51:46.707663 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.707632 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 07:51:46.707663 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.707640 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 07:51:46.707663 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.707656 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:46.709445 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.709433 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:46.711070 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.711061 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:46.711177 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.711169 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 07:51:46.713783 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.713773 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 17 07:51:46.714223 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.714212 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 07:51:46.714265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.714237 2579 file.go:69] 
"Watching path" path="/etc/kubernetes/manifests" Apr 17 07:51:46.714265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.714248 2579 kubelet.go:397] "Adding apiserver pod source" Apr 17 07:51:46.714265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.714257 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 07:51:46.715424 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.715413 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:46.715474 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.715430 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:46.719091 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.719071 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 07:51:46.720272 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.720252 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sl5dj" Apr 17 07:51:46.721529 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.721516 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 07:51:46.723143 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723123 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 07:51:46.723238 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723154 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 07:51:46.723238 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723166 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 07:51:46.723238 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723177 2579 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 07:51:46.723238 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723190 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 07:51:46.723238 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723198 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 07:51:46.723238 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723207 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 07:51:46.723238 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723215 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 07:51:46.723238 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723225 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 07:51:46.723238 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723234 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 07:51:46.723462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723265 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 07:51:46.723462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.723279 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 07:51:46.725096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.725083 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 07:51:46.725131 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.725099 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 07:51:46.725236 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.725207 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-45.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 07:51:46.725299 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.725207 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 07:51:46.727503 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.727473 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sl5dj" Apr 17 07:51:46.729096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.729082 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 07:51:46.729146 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.729130 2579 server.go:1295] "Started kubelet" Apr 17 07:51:46.729234 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.729200 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 07:51:46.729341 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.729248 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 07:51:46.729341 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.729326 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 07:51:46.729982 ip-10-0-142-45 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 07:51:46.730728 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.730697 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 07:51:46.736484 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.736467 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 17 07:51:46.741828 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.741811 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 07:51:46.742364 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.742337 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 07:51:46.743054 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743037 2579 factory.go:55] Registering systemd factory Apr 17 07:51:46.743139 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743058 2579 factory.go:223] Registration of the systemd container factory successfully Apr 17 07:51:46.743139 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743076 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 07:51:46.743139 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743094 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 07:51:46.743291 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743171 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 07:51:46.743291 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743208 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 17 07:51:46.743291 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743218 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 17 07:51:46.743291 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743277 2579 factory.go:153] Registering CRI-O factory Apr 17 07:51:46.743291 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743291 2579 factory.go:223] Registration of the crio container factory successfully Apr 
17 07:51:46.743474 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.743313 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found" Apr 17 07:51:46.743474 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743339 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 07:51:46.743474 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743355 2579 factory.go:103] Registering Raw factory Apr 17 07:51:46.743474 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.743371 2579 manager.go:1196] Started watching for new ooms in manager Apr 17 07:51:46.743641 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.743569 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 07:51:46.744415 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.744401 2579 manager.go:319] Starting recovery of all containers Apr 17 07:51:46.749203 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.749055 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:46.753404 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.753390 2579 manager.go:324] Recovery completed Apr 17 07:51:46.755294 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.755273 2579 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-142-45.ec2.internal" not found Apr 17 07:51:46.756105 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.756088 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-45.ec2.internal\" not found" node="ip-10-0-142-45.ec2.internal" Apr 17 07:51:46.758042 
ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.758030 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:46.760600 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.760586 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:46.760665 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.760613 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:46.760665 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.760623 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:46.761131 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.761115 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 07:51:46.761131 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.761126 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 07:51:46.761192 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.761142 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:46.763756 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.763744 2579 policy_none.go:49] "None policy: Start" Apr 17 07:51:46.763797 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.763759 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 07:51:46.763797 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.763768 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 17 07:51:46.778485 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.778468 2579 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-142-45.ec2.internal" not found Apr 17 07:51:46.804447 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.804431 2579 manager.go:341] "Starting Device Plugin manager" Apr 17 07:51:46.816987 
ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.804457 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 07:51:46.816987 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.804467 2579 server.go:85] "Starting device plugin registration server" Apr 17 07:51:46.816987 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.804673 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 07:51:46.816987 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.804687 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 07:51:46.816987 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.804814 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 07:51:46.816987 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.804929 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 07:51:46.816987 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.804938 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 07:51:46.816987 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.805374 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 07:51:46.816987 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.805412 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-45.ec2.internal\" not found" Apr 17 07:51:46.841950 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.841935 2579 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-142-45.ec2.internal" not found Apr 17 07:51:46.875570 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.875508 2579 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 17 07:51:46.876786 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.876772 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 07:51:46.876836 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.876797 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 07:51:46.876836 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.876815 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 07:51:46.876836 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.876822 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 07:51:46.876972 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.876853 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 07:51:46.881421 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.881404 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:46.906302 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.904944 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:46.907277 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.907262 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:46.907356 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.907292 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:46.907356 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.907301 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:46.907356 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.907321 2579 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-45.ec2.internal" Apr 17 07:51:46.921684 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.921668 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-45.ec2.internal" Apr 17 07:51:46.921728 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.921687 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-45.ec2.internal\": node \"ip-10-0-142-45.ec2.internal\" not found" Apr 17 07:51:46.959427 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:46.959410 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found" Apr 17 07:51:46.977463 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.977427 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal"] Apr 17 07:51:46.977531 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.977503 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:46.978954 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.978941 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:46.979017 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.978968 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:46.979017 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.978977 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:46.980057 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.980045 2579 kubelet_node_status.go:413] "Setting node annotation to 
enable volume controller attach/detach" Apr 17 07:51:46.980181 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.980167 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal" Apr 17 07:51:46.980228 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.980192 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:46.980649 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.980636 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:46.980708 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.980660 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:46.980708 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.980636 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:46.980708 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.980672 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:46.980708 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.980679 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:46.980708 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.980688 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:46.981795 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.981781 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" Apr 17 07:51:46.981849 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.981809 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:46.982563 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.982549 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:46.982634 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.982577 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:46.982634 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:46.982586 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:47.009488 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.009469 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-45.ec2.internal\" not found" node="ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.013869 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.013854 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-45.ec2.internal\" not found" node="ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.044322 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.044300 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/16712333c9c8322032c37f859cfbedd3-config\") pod \"kube-apiserver-proxy-ip-10-0-142-45.ec2.internal\" (UID: \"16712333c9c8322032c37f859cfbedd3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.044373 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:51:47.044330 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3990e5263d6ea7de7b84e806840ba42e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal\" (UID: \"3990e5263d6ea7de7b84e806840ba42e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.044373 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.044348 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3990e5263d6ea7de7b84e806840ba42e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal\" (UID: \"3990e5263d6ea7de7b84e806840ba42e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.060505 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.060475 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found" Apr 17 07:51:47.144532 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.144510 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/16712333c9c8322032c37f859cfbedd3-config\") pod \"kube-apiserver-proxy-ip-10-0-142-45.ec2.internal\" (UID: \"16712333c9c8322032c37f859cfbedd3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.144626 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.144537 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3990e5263d6ea7de7b84e806840ba42e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal\" (UID: \"3990e5263d6ea7de7b84e806840ba42e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.144626 
ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.144555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3990e5263d6ea7de7b84e806840ba42e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal\" (UID: \"3990e5263d6ea7de7b84e806840ba42e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.144626 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.144603 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3990e5263d6ea7de7b84e806840ba42e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal\" (UID: \"3990e5263d6ea7de7b84e806840ba42e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.144626 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.144608 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/16712333c9c8322032c37f859cfbedd3-config\") pod \"kube-apiserver-proxy-ip-10-0-142-45.ec2.internal\" (UID: \"16712333c9c8322032c37f859cfbedd3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.144741 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.144640 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3990e5263d6ea7de7b84e806840ba42e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal\" (UID: \"3990e5263d6ea7de7b84e806840ba42e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" Apr 17 07:51:47.161523 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.161505 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found" Apr 17 07:51:47.262228 
ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.262189 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:47.312401 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.312372 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal"
Apr 17 07:51:47.315929 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.315913 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal"
Apr 17 07:51:47.363044 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.363023 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:47.463589 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.463516 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:47.564078 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.564046 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:47.636246 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.636224 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 07:51:47.636850 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.636376 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:51:47.636850 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.636411 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:51:47.664772 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.664751 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:47.730278 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.730206 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:46:46 +0000 UTC" deadline="2027-09-21 20:06:03.48231113 +0000 UTC"
Apr 17 07:51:47.730278 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.730235 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12540h14m15.752078293s"
Apr 17 07:51:47.742363 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.742342 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 07:51:47.765321 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.765299 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:47.773776 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.773756 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:51:47.790560 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.790539 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gzwwt"
Apr 17 07:51:47.798349 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.798331 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gzwwt"
Apr 17 07:51:47.857472 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:47.857437 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3990e5263d6ea7de7b84e806840ba42e.slice/crio-a3997afd42684fd39d1b9950eb7d59d0e7f9f66b3d662166d05c9a2c87cacbf5 WatchSource:0}: Error finding container a3997afd42684fd39d1b9950eb7d59d0e7f9f66b3d662166d05c9a2c87cacbf5: Status 404 returned error can't find the container with id a3997afd42684fd39d1b9950eb7d59d0e7f9f66b3d662166d05c9a2c87cacbf5
Apr 17 07:51:47.857670 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:47.857652 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16712333c9c8322032c37f859cfbedd3.slice/crio-b1fae074c75cfd212598f189ba675b6eb4359f3e394175c994aadbce03348f80 WatchSource:0}: Error finding container b1fae074c75cfd212598f189ba675b6eb4359f3e394175c994aadbce03348f80: Status 404 returned error can't find the container with id b1fae074c75cfd212598f189ba675b6eb4359f3e394175c994aadbce03348f80
Apr 17 07:51:47.861492 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.861477 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 07:51:47.865960 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.865941 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:47.879593 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.879556 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" event={"ID":"3990e5263d6ea7de7b84e806840ba42e","Type":"ContainerStarted","Data":"a3997afd42684fd39d1b9950eb7d59d0e7f9f66b3d662166d05c9a2c87cacbf5"}
Apr 17 07:51:47.880447 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:47.880429 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal" event={"ID":"16712333c9c8322032c37f859cfbedd3","Type":"ContainerStarted","Data":"b1fae074c75cfd212598f189ba675b6eb4359f3e394175c994aadbce03348f80"}
Apr 17 07:51:47.966965 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:47.966925 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:48.067577 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.067487 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:48.168162 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.168136 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:48.186548 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.186528 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:48.268586 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.268537 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-45.ec2.internal\" not found"
Apr 17 07:51:48.328534 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.328455 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:48.343646 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.343453 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal"
Apr 17 07:51:48.356215 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.356193 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:51:48.357276 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.357218 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal"
Apr 17 07:51:48.368936 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.368807 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:51:48.715831 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.715800 2579 apiserver.go:52] "Watching apiserver"
Apr 17 07:51:48.728663 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.728640 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 07:51:48.731445 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.731410 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h","openshift-dns/node-resolver-62tj8","openshift-image-registry/node-ca-mvk82","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal","openshift-multus/multus-additional-cni-plugins-wg6r2","openshift-multus/multus-dds6t","openshift-cluster-node-tuning-operator/tuned-fnbbr","openshift-multus/network-metrics-daemon-qq9zp","openshift-network-diagnostics/network-check-target-ml85w","openshift-network-operator/iptables-alerter-7m6g5","openshift-ovn-kubernetes/ovnkube-node-ldxs6","kube-system/konnectivity-agent-862qc","kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal"]
Apr 17 07:51:48.734000 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.733979 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp"
Apr 17 07:51:48.734101 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.734056 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c"
Apr 17 07:51:48.735080 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.735054 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-62tj8"
Apr 17 07:51:48.736757 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.736410 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mvk82"
Apr 17 07:51:48.737646 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.737433 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-862qc"
Apr 17 07:51:48.737646 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.737558 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.740682 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.740422 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 07:51:48.740682 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.740432 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qw9hw\""
Apr 17 07:51:48.740682 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.740422 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 07:51:48.740828 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.740740 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.741596 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.741478 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 07:51:48.741596 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.741521 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 07:51:48.741754 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.741740 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 07:51:48.741806 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.741740 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rf8rr\""
Apr 17 07:51:48.742159 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.742142 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.743131 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.743115 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 07:51:48.743306 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.743263 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9t7p7\""
Apr 17 07:51:48.743655 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.743386 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7m6g5"
Apr 17 07:51:48.744741 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.744451 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zpq7q\""
Apr 17 07:51:48.744741 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.744467 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 07:51:48.744741 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.744483 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 07:51:48.744741 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.744533 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 07:51:48.744741 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.744554 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 07:51:48.744741 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.744473 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 07:51:48.744741 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.744642 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 07:51:48.745512 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.745494 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 07:51:48.745678 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.745653 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nzr8b\""
Apr 17 07:51:48.745911 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.745847 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w"
Apr 17 07:51:48.745999 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.745958 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.745999 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.745967 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d"
Apr 17 07:51:48.746753 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.746477 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 07:51:48.746753 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.746538 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wxbs8\""
Apr 17 07:51:48.746753 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.746607 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 07:51:48.746935 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.746840 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 07:51:48.747341 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.747322 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.747760 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.747743 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:51:48.749370 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.749353 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 07:51:48.749496 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.749476 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-w9vhh\""
Apr 17 07:51:48.749496 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.749370 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 07:51:48.750604 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.750588 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:51:48.750702 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.750623 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 07:51:48.750862 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.750846 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-crfbz\""
Apr 17 07:51:48.752046 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.752028 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 07:51:48.752141 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.752119 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 07:51:48.752194 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.752143 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8k57d\""
Apr 17 07:51:48.752387 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.752372 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 07:51:48.753138 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.752957 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 07:51:48.753138 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.752965 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 07:51:48.753138 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.753020 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 07:51:48.754909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.754870 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwb66\" (UniqueName: \"kubernetes.io/projected/83dbb733-5a1c-4565-b720-5b5bf99f74b8-kube-api-access-vwb66\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.754998 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.754975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-device-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.755040 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755003 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dp7n\" (UniqueName: \"kubernetes.io/projected/9066e076-2a8d-4930-9ca7-ec84bd3426c6-kube-api-access-6dp7n\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.755040 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755029 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2340e8a7-5618-413e-a2cf-f5ff4f985ad1-host-slash\") pod \"iptables-alerter-7m6g5\" (UID: \"2340e8a7-5618-413e-a2cf-f5ff4f985ad1\") " pod="openshift-network-operator/iptables-alerter-7m6g5"
Apr 17 07:51:48.755127 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755078 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trdnj\" (UniqueName: \"kubernetes.io/projected/c3da12b2-c527-43e5-96c7-37b56fb6b22d-kube-api-access-trdnj\") pod \"node-ca-mvk82\" (UID: \"c3da12b2-c527-43e5-96c7-37b56fb6b22d\") " pod="openshift-image-registry/node-ca-mvk82"
Apr 17 07:51:48.755127 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-os-release\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755216 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755152 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-var-lib-kubelet\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755216 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755189 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-multus-conf-dir\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755308 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755217 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4436d074-134b-4347-bddd-5a274dc24549-multus-daemon-config\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755308 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755241 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5343931-acc9-4a96-81bd-fc6bbad4d9be-tmp-dir\") pod \"node-resolver-62tj8\" (UID: \"e5343931-acc9-4a96-81bd-fc6bbad4d9be\") " pod="openshift-dns/node-resolver-62tj8"
Apr 17 07:51:48.755308 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755268 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3da12b2-c527-43e5-96c7-37b56fb6b22d-host\") pod \"node-ca-mvk82\" (UID: \"c3da12b2-c527-43e5-96c7-37b56fb6b22d\") " pod="openshift-image-registry/node-ca-mvk82"
Apr 17 07:51:48.755308 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755289 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-cnibin\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755488 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755315 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-run-k8s-cni-cncf-io\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755488 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755339 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-var-lib-cni-multus\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755488 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755363 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/954c8165-6465-403b-9b5d-f03c3bcce354-agent-certs\") pod \"konnectivity-agent-862qc\" (UID: \"954c8165-6465-403b-9b5d-f03c3bcce354\") " pod="kube-system/konnectivity-agent-862qc"
Apr 17 07:51:48.755488 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755385 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-os-release\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.755488 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755409 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-registration-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.755488 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755432 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-multus-socket-dir-parent\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755488 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755453 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-run-multus-certs\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755775 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755490 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-etc-kubernetes\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755775 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755516 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9nt7\" (UniqueName: \"kubernetes.io/projected/4436d074-134b-4347-bddd-5a274dc24549-kube-api-access-n9nt7\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755775 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755574 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5343931-acc9-4a96-81bd-fc6bbad4d9be-hosts-file\") pod \"node-resolver-62tj8\" (UID: \"e5343931-acc9-4a96-81bd-fc6bbad4d9be\") " pod="openshift-dns/node-resolver-62tj8"
Apr 17 07:51:48.755775 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755602 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-socket-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.755775 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755623 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-etc-selinux\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.755775 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755636 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nffw5\" (UniqueName: \"kubernetes.io/projected/2340e8a7-5618-413e-a2cf-f5ff4f985ad1-kube-api-access-nffw5\") pod \"iptables-alerter-7m6g5\" (UID: \"2340e8a7-5618-413e-a2cf-f5ff4f985ad1\") " pod="openshift-network-operator/iptables-alerter-7m6g5"
Apr 17 07:51:48.755775 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755670 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp"
Apr 17 07:51:48.755775 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755715 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-multus-cni-dir\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.755775 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755752 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-run-netns\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755788 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-hostroot\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755832 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/954c8165-6465-403b-9b5d-f03c3bcce354-konnectivity-ca\") pod \"konnectivity-agent-862qc\" (UID: \"954c8165-6465-403b-9b5d-f03c3bcce354\") " pod="kube-system/konnectivity-agent-862qc"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755906 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/83dbb733-5a1c-4565-b720-5b5bf99f74b8-cni-binary-copy\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755942 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2340e8a7-5618-413e-a2cf-f5ff4f985ad1-iptables-alerter-script\") pod \"iptables-alerter-7m6g5\" (UID: \"2340e8a7-5618-413e-a2cf-f5ff4f985ad1\") " pod="openshift-network-operator/iptables-alerter-7m6g5"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755968 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-var-lib-cni-bin\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.755993 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-system-cni-dir\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756016 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756044 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xss2f\" (UniqueName: \"kubernetes.io/projected/2b638776-5000-47ee-92c9-8ba7655f560c-kube-api-access-xss2f\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756068 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-system-cni-dir\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756093 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g77p9\" (UniqueName: \"kubernetes.io/projected/e5343931-acc9-4a96-81bd-fc6bbad4d9be-kube-api-access-g77p9\") pod \"node-resolver-62tj8\" (UID: \"e5343931-acc9-4a96-81bd-fc6bbad4d9be\") " pod="openshift-dns/node-resolver-62tj8"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756116 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-cnibin\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/83dbb733-5a1c-4565-b720-5b5bf99f74b8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/83dbb733-5a1c-4565-b720-5b5bf99f74b8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.756224 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756221 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-sys-fs\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.756631 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756243 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72nml\" (UniqueName: \"kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml\") pod \"network-check-target-ml85w\" (UID: \"45bb327f-877b-4da6-8c2d-e760eb62707d\") " pod="openshift-network-diagnostics/network-check-target-ml85w"
Apr 17 07:51:48.756675 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756276 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c3da12b2-c527-43e5-96c7-37b56fb6b22d-serviceca\") pod \"node-ca-mvk82\" (UID: \"c3da12b2-c527-43e5-96c7-37b56fb6b22d\") " pod="openshift-image-registry/node-ca-mvk82"
Apr 17 07:51:48.756726 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756689 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4436d074-134b-4347-bddd-5a274dc24549-cni-binary-copy\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.756726 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.756704 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.759717 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.759695 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:48.799110 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.799077 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:47 +0000 UTC" deadline="2027-11-01 17:52:55.066508596 +0000 UTC"
Apr 17 07:51:48.799110 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.799106 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13522h1m6.267405962s"
Apr 17 07:51:48.844851 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.844822 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 07:51:48.856934 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.856908 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName:
\"kubernetes.io/configmap/4436d074-134b-4347-bddd-5a274dc24549-cni-binary-copy\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.857056 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.856951 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bwts\" (UniqueName: \"kubernetes.io/projected/d966014f-f7ed-4082-afdb-6e81d3b82816-kube-api-access-5bwts\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.857056 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.856976 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-device-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" Apr 17 07:51:48.857056 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.856998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dp7n\" (UniqueName: \"kubernetes.io/projected/9066e076-2a8d-4930-9ca7-ec84bd3426c6-kube-api-access-6dp7n\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" Apr 17 07:51:48.857056 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857023 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2340e8a7-5618-413e-a2cf-f5ff4f985ad1-host-slash\") pod \"iptables-alerter-7m6g5\" (UID: \"2340e8a7-5618-413e-a2cf-f5ff4f985ad1\") " pod="openshift-network-operator/iptables-alerter-7m6g5" Apr 17 07:51:48.857247 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857074 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trdnj\" (UniqueName: \"kubernetes.io/projected/c3da12b2-c527-43e5-96c7-37b56fb6b22d-kube-api-access-trdnj\") pod \"node-ca-mvk82\" (UID: \"c3da12b2-c527-43e5-96c7-37b56fb6b22d\") " pod="openshift-image-registry/node-ca-mvk82" Apr 17 07:51:48.857247 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857099 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-device-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" Apr 17 07:51:48.857247 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857112 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-multus-conf-dir\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.857247 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857152 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-log-socket\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.857247 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857155 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2340e8a7-5618-413e-a2cf-f5ff4f985ad1-host-slash\") pod \"iptables-alerter-7m6g5\" (UID: \"2340e8a7-5618-413e-a2cf-f5ff4f985ad1\") " pod="openshift-network-operator/iptables-alerter-7m6g5" Apr 17 07:51:48.857247 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:51:48.857154 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-multus-conf-dir\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.857247 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857182 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3da12b2-c527-43e5-96c7-37b56fb6b22d-host\") pod \"node-ca-mvk82\" (UID: \"c3da12b2-c527-43e5-96c7-37b56fb6b22d\") " pod="openshift-image-registry/node-ca-mvk82" Apr 17 07:51:48.857247 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857218 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-cnibin\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857219 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3da12b2-c527-43e5-96c7-37b56fb6b22d-host\") pod \"node-ca-mvk82\" (UID: \"c3da12b2-c527-43e5-96c7-37b56fb6b22d\") " pod="openshift-image-registry/node-ca-mvk82" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857273 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-var-lib-cni-multus\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857305 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-cnibin\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-var-lib-cni-multus\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857324 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/954c8165-6465-403b-9b5d-f03c3bcce354-agent-certs\") pod \"konnectivity-agent-862qc\" (UID: \"954c8165-6465-403b-9b5d-f03c3bcce354\") " pod="kube-system/konnectivity-agent-862qc" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857350 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-os-release\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857377 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nffw5\" (UniqueName: \"kubernetes.io/projected/2340e8a7-5618-413e-a2cf-f5ff4f985ad1-kube-api-access-nffw5\") pod \"iptables-alerter-7m6g5\" (UID: \"2340e8a7-5618-413e-a2cf-f5ff4f985ad1\") " pod="openshift-network-operator/iptables-alerter-7m6g5" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857408 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-modprobe-d\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857431 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-sysconfig\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857441 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-os-release\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-run-multus-certs\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9nt7\" (UniqueName: \"kubernetes.io/projected/4436d074-134b-4347-bddd-5a274dc24549-kube-api-access-n9nt7\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857507 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5343931-acc9-4a96-81bd-fc6bbad4d9be-hosts-file\") pod \"node-resolver-62tj8\" (UID: \"e5343931-acc9-4a96-81bd-fc6bbad4d9be\") " pod="openshift-dns/node-resolver-62tj8" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857525 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4436d074-134b-4347-bddd-5a274dc24549-cni-binary-copy\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-socket-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857558 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-run-ovn\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.857578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857579 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-sysctl-conf\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857585 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5343931-acc9-4a96-81bd-fc6bbad4d9be-hosts-file\") pod \"node-resolver-62tj8\" (UID: \"e5343931-acc9-4a96-81bd-fc6bbad4d9be\") " pod="openshift-dns/node-resolver-62tj8" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f36b281-ef72-4eb5-963a-ef189f7f1559-tmp\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-socket-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.857687 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857746 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-multus-cni-dir\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857746 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-multus-cni-dir\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.857813 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs podName:2b638776-5000-47ee-92c9-8ba7655f560c nodeName:}" failed. No retries permitted until 2026-04-17 07:51:49.357761402 +0000 UTC m=+3.067364916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs") pod "network-metrics-daemon-qq9zp" (UID: "2b638776-5000-47ee-92c9-8ba7655f560c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/954c8165-6465-403b-9b5d-f03c3bcce354-konnectivity-ca\") pod \"konnectivity-agent-862qc\" (UID: \"954c8165-6465-403b-9b5d-f03c3bcce354\") " pod="kube-system/konnectivity-agent-862qc" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857876 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/83dbb733-5a1c-4565-b720-5b5bf99f74b8-cni-binary-copy\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857917 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-etc-openvswitch\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857935 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-run-openvswitch\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.858351 
ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857949 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-sysctl-d\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857965 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-var-lib-cni-bin\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857989 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-cnibin\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2" Apr 17 07:51:48.858351 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.857509 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-run-multus-certs\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858018 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-var-lib-cni-bin\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.859156 
ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858026 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/83dbb733-5a1c-4565-b720-5b5bf99f74b8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858054 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/83dbb733-5a1c-4565-b720-5b5bf99f74b8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858066 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-cnibin\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-run-systemd\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858108 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d966014f-f7ed-4082-afdb-6e81d3b82816-ovnkube-config\") pod 
\"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858136 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72nml\" (UniqueName: \"kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml\") pod \"network-check-target-ml85w\" (UID: \"45bb327f-877b-4da6-8c2d-e760eb62707d\") " pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-slash\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858196 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858221 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwb66\" (UniqueName: \"kubernetes.io/projected/83dbb733-5a1c-4565-b720-5b5bf99f74b8-kube-api-access-vwb66\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858246 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-cni-bin\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-run\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858297 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-lib-modules\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858323 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-var-lib-kubelet\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858349 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-os-release\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t" Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858412 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-var-lib-kubelet\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.859156 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/954c8165-6465-403b-9b5d-f03c3bcce354-konnectivity-ca\") pod \"konnectivity-agent-862qc\" (UID: \"954c8165-6465-403b-9b5d-f03c3bcce354\") " pod="kube-system/konnectivity-agent-862qc"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858427 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/83dbb733-5a1c-4565-b720-5b5bf99f74b8-cni-binary-copy\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4436d074-134b-4347-bddd-5a274dc24549-multus-daemon-config\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5343931-acc9-4a96-81bd-fc6bbad4d9be-tmp-dir\") pod \"node-resolver-62tj8\" (UID: \"e5343931-acc9-4a96-81bd-fc6bbad4d9be\") " pod="openshift-dns/node-resolver-62tj8"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858503 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-var-lib-kubelet\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858511 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858559 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/83dbb733-5a1c-4565-b720-5b5bf99f74b8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858568 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-os-release\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858505 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-systemd-units\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858614 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d966014f-f7ed-4082-afdb-6e81d3b82816-env-overrides\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858620 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/83dbb733-5a1c-4565-b720-5b5bf99f74b8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858640 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-kubernetes\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858676 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-run-k8s-cni-cncf-io\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858718 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-run-k8s-cni-cncf-io\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858743 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-registration-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-etc-selinux\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858786 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5343931-acc9-4a96-81bd-fc6bbad4d9be-tmp-dir\") pod \"node-resolver-62tj8\" (UID: \"e5343931-acc9-4a96-81bd-fc6bbad4d9be\") " pod="openshift-dns/node-resolver-62tj8"
Apr 17 07:51:48.859909 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-systemd\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858839 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-multus-socket-dir-parent\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858848 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-registration-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858865 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-etc-kubernetes\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858877 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-etc-selinux\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858912 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-tuned\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858918 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-multus-socket-dir-parent\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858936 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nhw\" (UniqueName: \"kubernetes.io/projected/9f36b281-ef72-4eb5-963a-ef189f7f1559-kube-api-access-88nhw\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858938 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-etc-kubernetes\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858958 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4436d074-134b-4347-bddd-5a274dc24549-multus-daemon-config\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858967 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-run-netns\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.858992 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-hostroot\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859018 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2340e8a7-5618-413e-a2cf-f5ff4f985ad1-iptables-alerter-script\") pod \"iptables-alerter-7m6g5\" (UID: \"2340e8a7-5618-413e-a2cf-f5ff4f985ad1\") " pod="openshift-network-operator/iptables-alerter-7m6g5"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859027 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-host-run-netns\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859047 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-cni-netd\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-sys\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-host\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.860458 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-system-cni-dir\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859102 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-hostroot\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859147 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859269 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-kubelet\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859285 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/83dbb733-5a1c-4565-b720-5b5bf99f74b8-system-cni-dir\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859315 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-var-lib-openvswitch\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859339 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-node-log\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859359 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-run-ovn-kubernetes\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859386 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xss2f\" (UniqueName: \"kubernetes.io/projected/2b638776-5000-47ee-92c9-8ba7655f560c-kube-api-access-xss2f\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859411 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-system-cni-dir\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g77p9\" (UniqueName: \"kubernetes.io/projected/e5343931-acc9-4a96-81bd-fc6bbad4d9be-kube-api-access-g77p9\") pod \"node-resolver-62tj8\" (UID: \"e5343931-acc9-4a96-81bd-fc6bbad4d9be\") " pod="openshift-dns/node-resolver-62tj8"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859453 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-sys-fs\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859475 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-run-netns\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859496 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859512 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4436d074-134b-4347-bddd-5a274dc24549-system-cni-dir\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859524 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d966014f-f7ed-4082-afdb-6e81d3b82816-ovn-node-metrics-cert\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.861001 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859548 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d966014f-f7ed-4082-afdb-6e81d3b82816-ovnkube-script-lib\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.861582 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c3da12b2-c527-43e5-96c7-37b56fb6b22d-serviceca\") pod \"node-ca-mvk82\" (UID: \"c3da12b2-c527-43e5-96c7-37b56fb6b22d\") " pod="openshift-image-registry/node-ca-mvk82"
Apr 17 07:51:48.861582 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859553 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9066e076-2a8d-4930-9ca7-ec84bd3426c6-sys-fs\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.861582 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859971 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c3da12b2-c527-43e5-96c7-37b56fb6b22d-serviceca\") pod \"node-ca-mvk82\" (UID: \"c3da12b2-c527-43e5-96c7-37b56fb6b22d\") " pod="openshift-image-registry/node-ca-mvk82"
Apr 17 07:51:48.861582 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.859986 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2340e8a7-5618-413e-a2cf-f5ff4f985ad1-iptables-alerter-script\") pod \"iptables-alerter-7m6g5\" (UID: \"2340e8a7-5618-413e-a2cf-f5ff4f985ad1\") " pod="openshift-network-operator/iptables-alerter-7m6g5"
Apr 17 07:51:48.861895 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.861864 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/954c8165-6465-403b-9b5d-f03c3bcce354-agent-certs\") pod \"konnectivity-agent-862qc\" (UID: \"954c8165-6465-403b-9b5d-f03c3bcce354\") " pod="kube-system/konnectivity-agent-862qc"
Apr 17 07:51:48.865950 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.865931 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:48.865950 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.865953 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:48.866109 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.865966 2579 projected.go:194] Error preparing data for projected volume kube-api-access-72nml for pod openshift-network-diagnostics/network-check-target-ml85w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:48.866109 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:48.866040 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml podName:45bb327f-877b-4da6-8c2d-e760eb62707d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:49.366023631 +0000 UTC m=+3.075627136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-72nml" (UniqueName: "kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml") pod "network-check-target-ml85w" (UID: "45bb327f-877b-4da6-8c2d-e760eb62707d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:48.877905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.867748 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nffw5\" (UniqueName: \"kubernetes.io/projected/2340e8a7-5618-413e-a2cf-f5ff4f985ad1-kube-api-access-nffw5\") pod \"iptables-alerter-7m6g5\" (UID: \"2340e8a7-5618-413e-a2cf-f5ff4f985ad1\") " pod="openshift-network-operator/iptables-alerter-7m6g5"
Apr 17 07:51:48.877905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.868061 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9nt7\" (UniqueName: \"kubernetes.io/projected/4436d074-134b-4347-bddd-5a274dc24549-kube-api-access-n9nt7\") pod \"multus-dds6t\" (UID: \"4436d074-134b-4347-bddd-5a274dc24549\") " pod="openshift-multus/multus-dds6t"
Apr 17 07:51:48.877905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.868471 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dp7n\" (UniqueName: \"kubernetes.io/projected/9066e076-2a8d-4930-9ca7-ec84bd3426c6-kube-api-access-6dp7n\") pod \"aws-ebs-csi-driver-node-n6k5h\" (UID: \"9066e076-2a8d-4930-9ca7-ec84bd3426c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h"
Apr 17 07:51:48.877905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.868733 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trdnj\" (UniqueName: \"kubernetes.io/projected/c3da12b2-c527-43e5-96c7-37b56fb6b22d-kube-api-access-trdnj\") pod \"node-ca-mvk82\" (UID: \"c3da12b2-c527-43e5-96c7-37b56fb6b22d\") " pod="openshift-image-registry/node-ca-mvk82"
Apr 17 07:51:48.877905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.869370 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g77p9\" (UniqueName: \"kubernetes.io/projected/e5343931-acc9-4a96-81bd-fc6bbad4d9be-kube-api-access-g77p9\") pod \"node-resolver-62tj8\" (UID: \"e5343931-acc9-4a96-81bd-fc6bbad4d9be\") " pod="openshift-dns/node-resolver-62tj8"
Apr 17 07:51:48.877905 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.869513 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwb66\" (UniqueName: \"kubernetes.io/projected/83dbb733-5a1c-4565-b720-5b5bf99f74b8-kube-api-access-vwb66\") pod \"multus-additional-cni-plugins-wg6r2\" (UID: \"83dbb733-5a1c-4565-b720-5b5bf99f74b8\") " pod="openshift-multus/multus-additional-cni-plugins-wg6r2"
Apr 17 07:51:48.879004 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.878924 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xss2f\" (UniqueName: \"kubernetes.io/projected/2b638776-5000-47ee-92c9-8ba7655f560c-kube-api-access-xss2f\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp"
Apr 17 07:51:48.959802 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959766 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-systemd-units\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.959980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959812 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d966014f-f7ed-4082-afdb-6e81d3b82816-env-overrides\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.959980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959839 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-kubernetes\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.959980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-systemd\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.959980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959877 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-tuned\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.959980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959913 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88nhw\" (UniqueName: \"kubernetes.io/projected/9f36b281-ef72-4eb5-963a-ef189f7f1559-kube-api-access-88nhw\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.959980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959909 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-systemd-units\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.959980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959930 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-cni-netd\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.959980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959957 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-cni-netd\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.959980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959968 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-sys\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-host\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.959998 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-kubernetes\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960021 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-kubelet\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-systemd\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-var-lib-openvswitch\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960070 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-node-log\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-run-ovn-kubernetes\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-run-netns\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960170 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d966014f-f7ed-4082-afdb-6e81d3b82816-ovn-node-metrics-cert\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960192 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d966014f-f7ed-4082-afdb-6e81d3b82816-ovnkube-script-lib\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960217 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bwts\" (UniqueName: \"kubernetes.io/projected/d966014f-f7ed-4082-afdb-6e81d3b82816-kube-api-access-5bwts\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960244 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-log-socket\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-modprobe-d\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-sysconfig\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-run-ovn-kubernetes\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.960390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960323 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-run-ovn\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6"
Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960348 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-sysctl-conf\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f36b281-ef72-4eb5-963a-ef189f7f1559-tmp\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr"
Apr 17 07:51:48.961161 ip-10-0-142-45
kubenswrapper[2579]: I0417 07:51:48.960384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d966014f-f7ed-4082-afdb-6e81d3b82816-env-overrides\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-host\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960412 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-etc-openvswitch\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-run-openvswitch\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960448 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-run-netns\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:51:48.960463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-sysctl-d\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960464 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-etc-openvswitch\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960494 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960530 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-sysconfig\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960764 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-node-log\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961161 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:51:48.961030 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-log-socket\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d966014f-f7ed-4082-afdb-6e81d3b82816-ovnkube-script-lib\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961093 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-run-openvswitch\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-modprobe-d\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961161 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961172 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-kubelet\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:51:48.961190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-sysctl-d\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961209 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-var-lib-openvswitch\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961235 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-run-systemd\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.960370 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-sys\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d966014f-f7ed-4082-afdb-6e81d3b82816-ovnkube-config\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961344 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-slash\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961356 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-sysctl-conf\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961396 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-slash\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961398 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-run-ovn\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-run-systemd\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961442 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-cni-bin\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961507 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-run\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-lib-modules\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961577 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-run\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-var-lib-kubelet\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961645 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-lib-modules\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961649 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d966014f-f7ed-4082-afdb-6e81d3b82816-host-cni-bin\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.961859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f36b281-ef72-4eb5-963a-ef189f7f1559-var-lib-kubelet\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.962610 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.961916 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d966014f-f7ed-4082-afdb-6e81d3b82816-ovnkube-config\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.962610 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.962541 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9f36b281-ef72-4eb5-963a-ef189f7f1559-etc-tuned\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.962741 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.962722 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/9f36b281-ef72-4eb5-963a-ef189f7f1559-tmp\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.962922 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.962900 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d966014f-f7ed-4082-afdb-6e81d3b82816-ovn-node-metrics-cert\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:48.972996 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.972935 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88nhw\" (UniqueName: \"kubernetes.io/projected/9f36b281-ef72-4eb5-963a-ef189f7f1559-kube-api-access-88nhw\") pod \"tuned-fnbbr\" (UID: \"9f36b281-ef72-4eb5-963a-ef189f7f1559\") " pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:48.974075 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:48.974056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bwts\" (UniqueName: \"kubernetes.io/projected/d966014f-f7ed-4082-afdb-6e81d3b82816-kube-api-access-5bwts\") pod \"ovnkube-node-ldxs6\" (UID: \"d966014f-f7ed-4082-afdb-6e81d3b82816\") " pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:49.009720 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.009692 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:49.047952 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.047923 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-62tj8" Apr 17 07:51:49.060670 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.060633 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mvk82" Apr 17 07:51:49.068383 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.068360 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-862qc" Apr 17 07:51:49.072929 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.072900 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wg6r2" Apr 17 07:51:49.079525 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.079503 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dds6t" Apr 17 07:51:49.086401 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.086370 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" Apr 17 07:51:49.092963 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.092944 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7m6g5" Apr 17 07:51:49.099577 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.099548 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" Apr 17 07:51:49.105134 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.105114 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:51:49.364210 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.364121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:49.364368 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:49.364297 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:49.364427 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:49.364369 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs podName:2b638776-5000-47ee-92c9-8ba7655f560c nodeName:}" failed. No retries permitted until 2026-04-17 07:51:50.36435007 +0000 UTC m=+4.073953563 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs") pod "network-metrics-daemon-qq9zp" (UID: "2b638776-5000-47ee-92c9-8ba7655f560c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:49.465140 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.465097 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72nml\" (UniqueName: \"kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml\") pod \"network-check-target-ml85w\" (UID: \"45bb327f-877b-4da6-8c2d-e760eb62707d\") " pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:51:49.465374 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:49.465241 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:49.465374 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:49.465255 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:49.465374 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:49.465281 2579 projected.go:194] Error preparing data for projected volume kube-api-access-72nml for pod openshift-network-diagnostics/network-check-target-ml85w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:49.465374 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:49.465330 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml podName:45bb327f-877b-4da6-8c2d-e760eb62707d nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:50.465317662 +0000 UTC m=+4.174921157 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-72nml" (UniqueName: "kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml") pod "network-check-target-ml85w" (UID: "45bb327f-877b-4da6-8c2d-e760eb62707d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:49.484948 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:49.484895 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f36b281_ef72_4eb5_963a_ef189f7f1559.slice/crio-f9859ca3a4351cd94a9cf0630deb2c4dbb5c701dacdc52cef336827e0043c8e3 WatchSource:0}: Error finding container f9859ca3a4351cd94a9cf0630deb2c4dbb5c701dacdc52cef336827e0043c8e3: Status 404 returned error can't find the container with id f9859ca3a4351cd94a9cf0630deb2c4dbb5c701dacdc52cef336827e0043c8e3 Apr 17 07:51:49.486654 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:49.486630 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4436d074_134b_4347_bddd_5a274dc24549.slice/crio-4626baebcecdaebe50781093d689628da86cbc109830f7c7e7ceffd958250290 WatchSource:0}: Error finding container 4626baebcecdaebe50781093d689628da86cbc109830f7c7e7ceffd958250290: Status 404 returned error can't find the container with id 4626baebcecdaebe50781093d689628da86cbc109830f7c7e7ceffd958250290 Apr 17 07:51:49.487715 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:49.487689 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9066e076_2a8d_4930_9ca7_ec84bd3426c6.slice/crio-0e570ea202f3a3508510997d1652460e20d465dedd4dc4f9c016cecc158fdad4 WatchSource:0}: Error finding container 
0e570ea202f3a3508510997d1652460e20d465dedd4dc4f9c016cecc158fdad4: Status 404 returned error can't find the container with id 0e570ea202f3a3508510997d1652460e20d465dedd4dc4f9c016cecc158fdad4 Apr 17 07:51:49.488845 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:49.488826 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83dbb733_5a1c_4565_b720_5b5bf99f74b8.slice/crio-bb7aeb5d8efee02c9e3115fd647ef20b26c3293f7fb23596c344fb54fefc7aaf WatchSource:0}: Error finding container bb7aeb5d8efee02c9e3115fd647ef20b26c3293f7fb23596c344fb54fefc7aaf: Status 404 returned error can't find the container with id bb7aeb5d8efee02c9e3115fd647ef20b26c3293f7fb23596c344fb54fefc7aaf Apr 17 07:51:49.490279 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:49.490186 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod954c8165_6465_403b_9b5d_f03c3bcce354.slice/crio-cf549d4161fdc269e23b074da61ad32e386c9c02edf7d7c0513ba250c7a83515 WatchSource:0}: Error finding container cf549d4161fdc269e23b074da61ad32e386c9c02edf7d7c0513ba250c7a83515: Status 404 returned error can't find the container with id cf549d4161fdc269e23b074da61ad32e386c9c02edf7d7c0513ba250c7a83515 Apr 17 07:51:49.491664 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:49.491636 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5343931_acc9_4a96_81bd_fc6bbad4d9be.slice/crio-68d6c87ed08fdd79f0d5e064f309b707146b5e6c5fadbd6736c6b8d5545cf6a1 WatchSource:0}: Error finding container 68d6c87ed08fdd79f0d5e064f309b707146b5e6c5fadbd6736c6b8d5545cf6a1: Status 404 returned error can't find the container with id 68d6c87ed08fdd79f0d5e064f309b707146b5e6c5fadbd6736c6b8d5545cf6a1 Apr 17 07:51:49.494462 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:51:49.494238 2579 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3da12b2_c527_43e5_96c7_37b56fb6b22d.slice/crio-8bdb89802a7e99ba1f26928dd30aafba28e9ca16e320c2ca631d07b7b47050a7 WatchSource:0}: Error finding container 8bdb89802a7e99ba1f26928dd30aafba28e9ca16e320c2ca631d07b7b47050a7: Status 404 returned error can't find the container with id 8bdb89802a7e99ba1f26928dd30aafba28e9ca16e320c2ca631d07b7b47050a7 Apr 17 07:51:49.799408 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.799369 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:47 +0000 UTC" deadline="2027-12-17 19:49:13.466252016 +0000 UTC" Apr 17 07:51:49.799408 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.799403 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14627h57m23.666851903s" Apr 17 07:51:49.878862 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.878771 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:49.879036 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:49.878998 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:51:49.886551 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.886518 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal" event={"ID":"16712333c9c8322032c37f859cfbedd3","Type":"ContainerStarted","Data":"1536a10e9974ee50fcf08f2a410ada4de31c5c1496b9e5c31f73654398bc4858"} Apr 17 07:51:49.892538 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.892486 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7m6g5" event={"ID":"2340e8a7-5618-413e-a2cf-f5ff4f985ad1","Type":"ContainerStarted","Data":"07b4858886cd7d35f9e277c20930eb5ab7cb94d0b6eeb4eb16676bfd9f0008b2"} Apr 17 07:51:49.906307 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.906276 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" event={"ID":"d966014f-f7ed-4082-afdb-6e81d3b82816","Type":"ContainerStarted","Data":"3f17ec3870bd61a0c330b42af26086ed5cf57e779ce4686947f03a97a54205d9"} Apr 17 07:51:49.915857 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.915772 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mvk82" event={"ID":"c3da12b2-c527-43e5-96c7-37b56fb6b22d","Type":"ContainerStarted","Data":"8bdb89802a7e99ba1f26928dd30aafba28e9ca16e320c2ca631d07b7b47050a7"} Apr 17 07:51:49.916942 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.916584 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-45.ec2.internal" podStartSLOduration=1.916567838 podStartE2EDuration="1.916567838s" podCreationTimestamp="2026-04-17 07:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:49.915869356 +0000 UTC m=+3.625472870" 
watchObservedRunningTime="2026-04-17 07:51:49.916567838 +0000 UTC m=+3.626171350" Apr 17 07:51:49.918728 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.918704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-62tj8" event={"ID":"e5343931-acc9-4a96-81bd-fc6bbad4d9be","Type":"ContainerStarted","Data":"68d6c87ed08fdd79f0d5e064f309b707146b5e6c5fadbd6736c6b8d5545cf6a1"} Apr 17 07:51:49.920044 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.919998 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-862qc" event={"ID":"954c8165-6465-403b-9b5d-f03c3bcce354","Type":"ContainerStarted","Data":"cf549d4161fdc269e23b074da61ad32e386c9c02edf7d7c0513ba250c7a83515"} Apr 17 07:51:49.923877 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.923853 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" event={"ID":"9066e076-2a8d-4930-9ca7-ec84bd3426c6","Type":"ContainerStarted","Data":"0e570ea202f3a3508510997d1652460e20d465dedd4dc4f9c016cecc158fdad4"} Apr 17 07:51:49.926066 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.926014 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wg6r2" event={"ID":"83dbb733-5a1c-4565-b720-5b5bf99f74b8","Type":"ContainerStarted","Data":"bb7aeb5d8efee02c9e3115fd647ef20b26c3293f7fb23596c344fb54fefc7aaf"} Apr 17 07:51:49.927347 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.927302 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dds6t" event={"ID":"4436d074-134b-4347-bddd-5a274dc24549","Type":"ContainerStarted","Data":"4626baebcecdaebe50781093d689628da86cbc109830f7c7e7ceffd958250290"} Apr 17 07:51:49.930013 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:49.929977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" 
event={"ID":"9f36b281-ef72-4eb5-963a-ef189f7f1559","Type":"ContainerStarted","Data":"f9859ca3a4351cd94a9cf0630deb2c4dbb5c701dacdc52cef336827e0043c8e3"} Apr 17 07:51:50.376046 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:50.376017 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:50.376213 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:50.376197 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:50.376289 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:50.376259 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs podName:2b638776-5000-47ee-92c9-8ba7655f560c nodeName:}" failed. No retries permitted until 2026-04-17 07:51:52.376241891 +0000 UTC m=+6.085845394 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs") pod "network-metrics-daemon-qq9zp" (UID: "2b638776-5000-47ee-92c9-8ba7655f560c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:50.477555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:50.477517 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72nml\" (UniqueName: \"kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml\") pod \"network-check-target-ml85w\" (UID: \"45bb327f-877b-4da6-8c2d-e760eb62707d\") " pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:51:50.477736 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:50.477700 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:50.477736 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:50.477718 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:50.477736 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:50.477730 2579 projected.go:194] Error preparing data for projected volume kube-api-access-72nml for pod openshift-network-diagnostics/network-check-target-ml85w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:50.477912 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:50.477802 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml podName:45bb327f-877b-4da6-8c2d-e760eb62707d nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:52.477770934 +0000 UTC m=+6.187374431 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-72nml" (UniqueName: "kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml") pod "network-check-target-ml85w" (UID: "45bb327f-877b-4da6-8c2d-e760eb62707d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:50.880061 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:50.880006 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:51:50.880537 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:50.880119 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:51:50.939757 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:50.939715 2579 generic.go:358] "Generic (PLEG): container finished" podID="3990e5263d6ea7de7b84e806840ba42e" containerID="e85056ac532dd19b3b7e6f9df1b3c88d31a947f76fb0241561325512520277e8" exitCode=0 Apr 17 07:51:50.940672 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:50.940648 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" event={"ID":"3990e5263d6ea7de7b84e806840ba42e","Type":"ContainerDied","Data":"e85056ac532dd19b3b7e6f9df1b3c88d31a947f76fb0241561325512520277e8"} Apr 17 07:51:51.877907 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:51.877857 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:51.878081 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:51.878020 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:51:51.966653 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:51.966020 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" event={"ID":"3990e5263d6ea7de7b84e806840ba42e","Type":"ContainerStarted","Data":"dd3b9af156e3ed3b086e6f7698967dda6b1c4337f7f3b5baceaa9b3ee4d52036"} Apr 17 07:51:51.984256 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:51.984206 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-45.ec2.internal" podStartSLOduration=3.9841851630000003 podStartE2EDuration="3.984185163s" podCreationTimestamp="2026-04-17 07:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:51.983481489 +0000 UTC m=+5.693085004" watchObservedRunningTime="2026-04-17 07:51:51.984185163 +0000 UTC m=+5.693788676" Apr 17 07:51:52.395105 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:52.395069 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:52.395275 
ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:52.395232 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:52.395334 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:52.395293 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs podName:2b638776-5000-47ee-92c9-8ba7655f560c nodeName:}" failed. No retries permitted until 2026-04-17 07:51:56.39527493 +0000 UTC m=+10.104878422 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs") pod "network-metrics-daemon-qq9zp" (UID: "2b638776-5000-47ee-92c9-8ba7655f560c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:52.495632 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:52.495593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72nml\" (UniqueName: \"kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml\") pod \"network-check-target-ml85w\" (UID: \"45bb327f-877b-4da6-8c2d-e760eb62707d\") " pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:51:52.495814 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:52.495800 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:52.495868 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:52.495817 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:52.495868 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:52.495830 2579 projected.go:194] Error preparing data for projected volume 
kube-api-access-72nml for pod openshift-network-diagnostics/network-check-target-ml85w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:52.495997 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:52.495904 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml podName:45bb327f-877b-4da6-8c2d-e760eb62707d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:56.495870119 +0000 UTC m=+10.205473615 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-72nml" (UniqueName: "kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml") pod "network-check-target-ml85w" (UID: "45bb327f-877b-4da6-8c2d-e760eb62707d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:52.879111 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:52.879081 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:51:52.879292 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:52.879215 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:51:53.877321 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:53.877287 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:53.877677 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:53.877446 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:51:54.879740 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:54.879701 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:51:54.880144 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:54.879844 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:51:55.877642 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:55.877603 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:55.877819 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:55.877751 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:51:56.433833 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:56.433798 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:56.434319 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:56.433974 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:56.434319 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:56.434032 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs podName:2b638776-5000-47ee-92c9-8ba7655f560c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:04.43401424 +0000 UTC m=+18.143617735 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs") pod "network-metrics-daemon-qq9zp" (UID: "2b638776-5000-47ee-92c9-8ba7655f560c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:56.534948 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:56.534912 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72nml\" (UniqueName: \"kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml\") pod \"network-check-target-ml85w\" (UID: \"45bb327f-877b-4da6-8c2d-e760eb62707d\") " pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:51:56.535130 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:56.535073 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:56.535130 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:56.535091 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:56.535130 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:56.535105 2579 projected.go:194] Error preparing data for projected volume kube-api-access-72nml for pod openshift-network-diagnostics/network-check-target-ml85w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:56.535275 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:56.535151 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml podName:45bb327f-877b-4da6-8c2d-e760eb62707d nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:04.535138102 +0000 UTC m=+18.244741593 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-72nml" (UniqueName: "kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml") pod "network-check-target-ml85w" (UID: "45bb327f-877b-4da6-8c2d-e760eb62707d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:56.878957 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:56.878927 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:51:56.879112 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:56.879056 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:51:57.877568 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:57.877535 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:57.877995 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:57.877654 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:51:58.877136 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:58.877060 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:51:58.877322 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:58.877178 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:51:59.877717 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:51:59.877679 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:51:59.878246 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:51:59.877820 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:00.878042 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:00.878006 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:00.878494 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:00.878135 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:52:01.877611 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:01.877574 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:01.877852 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:01.877709 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:02.877668 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:02.877635 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:02.878140 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:02.877795 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:52:03.877833 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:03.877802 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:03.878289 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:03.877943 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:04.497320 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:04.497268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:04.497511 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:04.497411 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:04.497511 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:04.497481 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs podName:2b638776-5000-47ee-92c9-8ba7655f560c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:20.497466077 +0000 UTC m=+34.207069568 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs") pod "network-metrics-daemon-qq9zp" (UID: "2b638776-5000-47ee-92c9-8ba7655f560c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:04.597967 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:04.597938 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72nml\" (UniqueName: \"kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml\") pod \"network-check-target-ml85w\" (UID: \"45bb327f-877b-4da6-8c2d-e760eb62707d\") " pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:04.598153 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:04.598080 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:04.598153 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:04.598094 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:04.598153 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:04.598102 2579 projected.go:194] Error preparing data for projected volume kube-api-access-72nml for pod openshift-network-diagnostics/network-check-target-ml85w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:04.598336 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:04.598164 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml podName:45bb327f-877b-4da6-8c2d-e760eb62707d nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:20.598146679 +0000 UTC m=+34.307750186 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-72nml" (UniqueName: "kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml") pod "network-check-target-ml85w" (UID: "45bb327f-877b-4da6-8c2d-e760eb62707d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:04.877297 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:04.877190 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:04.877452 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:04.877321 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:52:05.877582 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:05.877555 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:05.878005 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:05.877659 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:06.878940 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:06.878704 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:06.879621 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:06.879015 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:52:06.998523 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:06.998500 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 07:52:06.998831 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:06.998808 2579 generic.go:358] "Generic (PLEG): container finished" podID="d966014f-f7ed-4082-afdb-6e81d3b82816" containerID="d2cabb4cb7b6679d525e76baff2bc0f1ba3e6baf161c9b89ce2a4ec7514ce829" exitCode=1 Apr 17 07:52:06.998932 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:06.998871 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" event={"ID":"d966014f-f7ed-4082-afdb-6e81d3b82816","Type":"ContainerStarted","Data":"fdf34cbd5472ca42f4f8701c6608386582ae2c289634aeda41856ddbf25fa2c2"} Apr 17 07:52:06.998932 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:06.998913 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" 
event={"ID":"d966014f-f7ed-4082-afdb-6e81d3b82816","Type":"ContainerStarted","Data":"3c80763bc1f4f553f4f5eb9549037bf7c10e474d63d612c28a606e27d689470c"} Apr 17 07:52:06.998932 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:06.998925 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" event={"ID":"d966014f-f7ed-4082-afdb-6e81d3b82816","Type":"ContainerDied","Data":"d2cabb4cb7b6679d525e76baff2bc0f1ba3e6baf161c9b89ce2a4ec7514ce829"} Apr 17 07:52:06.999050 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:06.998939 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" event={"ID":"d966014f-f7ed-4082-afdb-6e81d3b82816","Type":"ContainerStarted","Data":"38a433e1abb9bf8bde4efd183fee6597ee4070a91032a47f87acf29ac59204df"} Apr 17 07:52:07.000397 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.000247 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mvk82" event={"ID":"c3da12b2-c527-43e5-96c7-37b56fb6b22d","Type":"ContainerStarted","Data":"a5ac222842d60df0b62864d10a5232f7560bf839df45c672e38fa1787a5e8633"} Apr 17 07:52:07.001628 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.001594 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-62tj8" event={"ID":"e5343931-acc9-4a96-81bd-fc6bbad4d9be","Type":"ContainerStarted","Data":"d9067d09f2a01d6f356576912f6177ba8f150aaac59e7428b56cd5a397fd7548"} Apr 17 07:52:07.002756 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.002734 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-862qc" event={"ID":"954c8165-6465-403b-9b5d-f03c3bcce354","Type":"ContainerStarted","Data":"b98d42ff8caba48889978203250535efd3f83cd90ea1d5703f1ff1a57c13df87"} Apr 17 07:52:07.003956 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.003938 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" event={"ID":"9066e076-2a8d-4930-9ca7-ec84bd3426c6","Type":"ContainerStarted","Data":"f2bddc180032744bad83febab3e2549ab28a70c9d77da8e87dce9f2fa940d78e"} Apr 17 07:52:07.005369 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.005350 2579 generic.go:358] "Generic (PLEG): container finished" podID="83dbb733-5a1c-4565-b720-5b5bf99f74b8" containerID="8bbd0525dc6ee1eaad4cdcd702a71a3d7445042560858625156f1c3a900db44d" exitCode=0 Apr 17 07:52:07.005449 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.005411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wg6r2" event={"ID":"83dbb733-5a1c-4565-b720-5b5bf99f74b8","Type":"ContainerDied","Data":"8bbd0525dc6ee1eaad4cdcd702a71a3d7445042560858625156f1c3a900db44d"} Apr 17 07:52:07.007033 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.007003 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dds6t" event={"ID":"4436d074-134b-4347-bddd-5a274dc24549","Type":"ContainerStarted","Data":"a55a55aac8003a27d34a799d537c2f964ba245a1737434028735551dcb65e910"} Apr 17 07:52:07.008247 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.008226 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" event={"ID":"9f36b281-ef72-4eb5-963a-ef189f7f1559","Type":"ContainerStarted","Data":"a1b2bae1b9740acd3926eea73d4914dc301fd847946277b95c2a9ffbfbfb94f8"} Apr 17 07:52:07.021658 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.021621 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mvk82" podStartSLOduration=4.278411762 podStartE2EDuration="21.021611543s" podCreationTimestamp="2026-04-17 07:51:46 +0000 UTC" firstStartedPulling="2026-04-17 07:51:49.496196639 +0000 UTC m=+3.205800132" lastFinishedPulling="2026-04-17 07:52:06.239396422 +0000 UTC m=+19.948999913" 
observedRunningTime="2026-04-17 07:52:07.021169317 +0000 UTC m=+20.730772822" watchObservedRunningTime="2026-04-17 07:52:07.021611543 +0000 UTC m=+20.731215055" Apr 17 07:52:07.068469 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.068429 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-862qc" podStartSLOduration=8.617432257 podStartE2EDuration="21.068412963s" podCreationTimestamp="2026-04-17 07:51:46 +0000 UTC" firstStartedPulling="2026-04-17 07:51:49.491802961 +0000 UTC m=+3.201406452" lastFinishedPulling="2026-04-17 07:52:01.942783668 +0000 UTC m=+15.652387158" observedRunningTime="2026-04-17 07:52:07.06805753 +0000 UTC m=+20.777661044" watchObservedRunningTime="2026-04-17 07:52:07.068412963 +0000 UTC m=+20.778016474" Apr 17 07:52:07.068598 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.068570 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-62tj8" podStartSLOduration=4.322849371 podStartE2EDuration="21.068564727s" podCreationTimestamp="2026-04-17 07:51:46 +0000 UTC" firstStartedPulling="2026-04-17 07:51:49.494087158 +0000 UTC m=+3.203690662" lastFinishedPulling="2026-04-17 07:52:06.239802528 +0000 UTC m=+19.949406018" observedRunningTime="2026-04-17 07:52:07.043952241 +0000 UTC m=+20.753555753" watchObservedRunningTime="2026-04-17 07:52:07.068564727 +0000 UTC m=+20.778168238" Apr 17 07:52:07.101859 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.101816 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dds6t" podStartSLOduration=4.208660464 podStartE2EDuration="21.101802632s" podCreationTimestamp="2026-04-17 07:51:46 +0000 UTC" firstStartedPulling="2026-04-17 07:51:49.488557165 +0000 UTC m=+3.198160655" lastFinishedPulling="2026-04-17 07:52:06.381699326 +0000 UTC m=+20.091302823" observedRunningTime="2026-04-17 07:52:07.101655276 +0000 UTC m=+20.811258791" 
watchObservedRunningTime="2026-04-17 07:52:07.101802632 +0000 UTC m=+20.811406144" Apr 17 07:52:07.135811 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.135759 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fnbbr" podStartSLOduration=3.374094457 podStartE2EDuration="20.13574365s" podCreationTimestamp="2026-04-17 07:51:47 +0000 UTC" firstStartedPulling="2026-04-17 07:51:49.486682493 +0000 UTC m=+3.196285983" lastFinishedPulling="2026-04-17 07:52:06.248331682 +0000 UTC m=+19.957935176" observedRunningTime="2026-04-17 07:52:07.135356802 +0000 UTC m=+20.844960311" watchObservedRunningTime="2026-04-17 07:52:07.13574365 +0000 UTC m=+20.845347160" Apr 17 07:52:07.877263 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.877205 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:07.877398 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:07.877375 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:07.970861 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:07.970840 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 07:52:08.011321 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:08.011294 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7m6g5" event={"ID":"2340e8a7-5618-413e-a2cf-f5ff4f985ad1","Type":"ContainerStarted","Data":"e54763409580a967c818366344637e9eb27b164d165a2e28d9f2037a11a143c6"} Apr 17 07:52:08.013656 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:08.013637 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 07:52:08.014006 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:08.013984 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" event={"ID":"d966014f-f7ed-4082-afdb-6e81d3b82816","Type":"ContainerStarted","Data":"1bc2a8b115dd0445a5bcc438753fa66866135637deb26ec3b2f6061ebc3b2d59"} Apr 17 07:52:08.014082 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:08.014014 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" event={"ID":"d966014f-f7ed-4082-afdb-6e81d3b82816","Type":"ContainerStarted","Data":"1f060de4e56b676f81cbfc11c355d608865a18e6dc8d71e0128e51ae6fd0fa7a"} Apr 17 07:52:08.015427 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:08.015398 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" event={"ID":"9066e076-2a8d-4930-9ca7-ec84bd3426c6","Type":"ContainerStarted","Data":"d954802feb5a101cfcfb951b06fdb5287815d6aeed2783117608f225c3c3227a"} Apr 17 07:52:08.817290 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:52:08.817140 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T07:52:07.970860032Z","UUID":"c0a40be7-cec5-44e6-9d17-7ea68db91b63","Handler":null,"Name":"","Endpoint":""} Apr 17 07:52:08.819573 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:08.819327 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 07:52:08.819573 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:08.819357 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 07:52:08.877780 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:08.877739 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:08.877938 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:08.877870 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:52:09.878046 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:09.877850 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:09.878496 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:09.878158 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:10.022563 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:10.022539 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 07:52:10.022983 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:10.022946 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" event={"ID":"d966014f-f7ed-4082-afdb-6e81d3b82816","Type":"ContainerStarted","Data":"87624720fd72e47157acf8f86997c037c2f6f90b57055575d9a21eb028738a7e"} Apr 17 07:52:10.024932 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:10.024909 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" event={"ID":"9066e076-2a8d-4930-9ca7-ec84bd3426c6","Type":"ContainerStarted","Data":"dd7762785ad285c5357ec83bc2e3b543f0eac4e3935565f59a9940f4a857ca60"} Apr 17 07:52:10.053114 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:10.053054 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7m6g5" podStartSLOduration=6.311736123 podStartE2EDuration="23.053043608s" podCreationTimestamp="2026-04-17 07:51:47 +0000 UTC" firstStartedPulling="2026-04-17 07:51:49.498411503 +0000 UTC m=+3.208015006" lastFinishedPulling="2026-04-17 07:52:06.239718997 +0000 UTC m=+19.949322491" 
observedRunningTime="2026-04-17 07:52:08.032764741 +0000 UTC m=+21.742368252" watchObservedRunningTime="2026-04-17 07:52:10.053043608 +0000 UTC m=+23.762647122" Apr 17 07:52:10.053362 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:10.053339 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n6k5h" podStartSLOduration=4.276605287 podStartE2EDuration="24.053333861s" podCreationTimestamp="2026-04-17 07:51:46 +0000 UTC" firstStartedPulling="2026-04-17 07:51:49.490622881 +0000 UTC m=+3.200226371" lastFinishedPulling="2026-04-17 07:52:09.267351455 +0000 UTC m=+22.976954945" observedRunningTime="2026-04-17 07:52:10.052674065 +0000 UTC m=+23.762277576" watchObservedRunningTime="2026-04-17 07:52:10.053333861 +0000 UTC m=+23.762937373" Apr 17 07:52:10.877933 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:10.877902 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:10.878115 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:10.878017 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:52:11.580530 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:11.580500 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-862qc" Apr 17 07:52:11.581113 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:11.581094 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-862qc" Apr 17 07:52:11.877666 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:11.877588 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:11.877796 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:11.877720 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:12.028296 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:12.028274 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-862qc" Apr 17 07:52:12.028741 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:12.028706 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-862qc" Apr 17 07:52:12.877137 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:12.876965 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:12.877324 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:12.877201 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:52:13.031118 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:13.031085 2579 generic.go:358] "Generic (PLEG): container finished" podID="83dbb733-5a1c-4565-b720-5b5bf99f74b8" containerID="bf3436e104f1e98d11b12f02cb1d082c5e58777f5a99534fa08334af6595bdb8" exitCode=0 Apr 17 07:52:13.031589 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:13.031173 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wg6r2" event={"ID":"83dbb733-5a1c-4565-b720-5b5bf99f74b8","Type":"ContainerDied","Data":"bf3436e104f1e98d11b12f02cb1d082c5e58777f5a99534fa08334af6595bdb8"} Apr 17 07:52:13.034194 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:13.034100 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 07:52:13.034470 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:13.034434 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" event={"ID":"d966014f-f7ed-4082-afdb-6e81d3b82816","Type":"ContainerStarted","Data":"594b928c451743b2df23ef712e374069cb08eda4dc00fcf82aed3b5043819a0c"} Apr 17 07:52:13.034926 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:13.034911 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 
07:52:13.034996 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:13.034936 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:52:13.035057 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:13.035014 2579 scope.go:117] "RemoveContainer" containerID="d2cabb4cb7b6679d525e76baff2bc0f1ba3e6baf161c9b89ce2a4ec7514ce829" Apr 17 07:52:13.050901 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:13.050859 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:52:13.877333 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:13.877306 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:13.877453 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:13.877412 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:14.004772 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.004693 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ml85w"] Apr 17 07:52:14.004943 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.004817 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:14.004943 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:14.004931 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:52:14.007494 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.007467 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qq9zp"] Apr 17 07:52:14.038990 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.038961 2579 generic.go:358] "Generic (PLEG): container finished" podID="83dbb733-5a1c-4565-b720-5b5bf99f74b8" containerID="b142a30572f62238a8c80b893fac5e5d182bb9b8bc3924d6e93d8256550519f0" exitCode=0 Apr 17 07:52:14.039399 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.039043 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wg6r2" event={"ID":"83dbb733-5a1c-4565-b720-5b5bf99f74b8","Type":"ContainerDied","Data":"b142a30572f62238a8c80b893fac5e5d182bb9b8bc3924d6e93d8256550519f0"} Apr 17 07:52:14.043162 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.043147 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 07:52:14.043478 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.043454 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" event={"ID":"d966014f-f7ed-4082-afdb-6e81d3b82816","Type":"ContainerStarted","Data":"94f11031d970a1bab34a1d96928cba63169a606cf0fa8a3c693908dcba592259"} Apr 17 07:52:14.043550 
ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.043494 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:14.043615 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:14.043592 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:14.044269 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.044236 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:52:14.060205 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.060184 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:52:14.103082 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:14.103041 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" podStartSLOduration=10.157215175 podStartE2EDuration="27.103028801s" podCreationTimestamp="2026-04-17 07:51:47 +0000 UTC" firstStartedPulling="2026-04-17 07:51:49.496150485 +0000 UTC m=+3.205753975" lastFinishedPulling="2026-04-17 07:52:06.441964095 +0000 UTC m=+20.151567601" observedRunningTime="2026-04-17 07:52:14.103020401 +0000 UTC m=+27.812623916" watchObservedRunningTime="2026-04-17 07:52:14.103028801 +0000 UTC m=+27.812632313" Apr 17 07:52:15.047521 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:15.047322 2579 generic.go:358] "Generic (PLEG): container finished" podID="83dbb733-5a1c-4565-b720-5b5bf99f74b8" containerID="cbebdb4b20e965fef180a48b3a1d62b5ea2c68fe0a3e61807414eeb31ea33bca" exitCode=0 Apr 
17 07:52:15.047982 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:15.047396 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wg6r2" event={"ID":"83dbb733-5a1c-4565-b720-5b5bf99f74b8","Type":"ContainerDied","Data":"cbebdb4b20e965fef180a48b3a1d62b5ea2c68fe0a3e61807414eeb31ea33bca"} Apr 17 07:52:15.877483 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:15.877402 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:15.877626 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:15.877402 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:15.877626 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:15.877531 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:15.877626 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:15.877600 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:52:17.877951 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:17.877920 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:17.878581 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:17.878039 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml85w" podUID="45bb327f-877b-4da6-8c2d-e760eb62707d" Apr 17 07:52:17.878581 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:17.877920 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:17.878581 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:17.878516 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c" Apr 17 07:52:19.080635 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.080595 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-45.ec2.internal" event="NodeReady" Apr 17 07:52:19.081285 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.080745 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 07:52:19.131175 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.131140 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lrw8v"] Apr 17 07:52:19.135219 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.135186 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.136928 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.136896 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bh4vm"] Apr 17 07:52:19.139662 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.139643 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:19.141764 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.141728 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 07:52:19.141915 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.141777 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-l9k4x\"" Apr 17 07:52:19.141915 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.141737 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 07:52:19.145679 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.145325 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 07:52:19.145679 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.145632 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 07:52:19.146385 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.146351 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tgj8b\"" Apr 17 07:52:19.146469 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.146418 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 07:52:19.151277 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:52:19.151255 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lrw8v"] Apr 17 07:52:19.151990 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.151967 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bh4vm"] Apr 17 07:52:19.309131 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.309043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-config-volume\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.309131 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.309091 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:19.309131 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.309124 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qrj9\" (UniqueName: \"kubernetes.io/projected/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-kube-api-access-9qrj9\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:19.309422 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.309190 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6588\" (UniqueName: \"kubernetes.io/projected/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-kube-api-access-l6588\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.309422 
ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.309228 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-tmp-dir\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.309422 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.309277 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.410326 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.410292 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-config-volume\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.410517 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.410337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:19.410517 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.410375 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qrj9\" (UniqueName: \"kubernetes.io/projected/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-kube-api-access-9qrj9\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:19.410517 ip-10-0-142-45 
kubenswrapper[2579]: E0417 07:52:19.410481 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:19.410664 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:19.410548 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert podName:d4301eb7-7ba7-42b5-961e-ef10f1fe7955 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:19.91052625 +0000 UTC m=+33.620129767 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert") pod "ingress-canary-bh4vm" (UID: "d4301eb7-7ba7-42b5-961e-ef10f1fe7955") : secret "canary-serving-cert" not found Apr 17 07:52:19.410664 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.410600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6588\" (UniqueName: \"kubernetes.io/projected/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-kube-api-access-l6588\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.410664 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.410628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-tmp-dir\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.410664 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.410655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.410853 ip-10-0-142-45 
kubenswrapper[2579]: E0417 07:52:19.410748 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:19.410853 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:19.410790 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls podName:61a4efa4-9ac5-47c3-ba8a-6fa191936c56 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:19.910776946 +0000 UTC m=+33.620380437 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls") pod "dns-default-lrw8v" (UID: "61a4efa4-9ac5-47c3-ba8a-6fa191936c56") : secret "dns-default-metrics-tls" not found Apr 17 07:52:19.411010 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.410990 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-tmp-dir\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.411010 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.411002 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-config-volume\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.426177 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.426150 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6588\" (UniqueName: \"kubernetes.io/projected/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-kube-api-access-l6588\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.426327 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:52:19.426215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qrj9\" (UniqueName: \"kubernetes.io/projected/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-kube-api-access-9qrj9\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:19.877391 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.877353 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:19.877579 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.877353 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:19.886895 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.886866 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqz2q\"" Apr 17 07:52:19.887014 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.886919 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 07:52:19.887014 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.886927 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 07:52:19.887121 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.886907 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 07:52:19.887121 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.886867 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4xllf\"" Apr 17 07:52:19.913443 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.913406 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:19.913597 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:19.913474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:19.913662 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:19.913634 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:19.913718 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:19.913690 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert podName:d4301eb7-7ba7-42b5-961e-ef10f1fe7955 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:20.913671845 +0000 UTC m=+34.623275353 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert") pod "ingress-canary-bh4vm" (UID: "d4301eb7-7ba7-42b5-961e-ef10f1fe7955") : secret "canary-serving-cert" not found Apr 17 07:52:19.914117 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:19.914100 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:19.914202 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:19.914147 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls podName:61a4efa4-9ac5-47c3-ba8a-6fa191936c56 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:20.914133555 +0000 UTC m=+34.623737050 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls") pod "dns-default-lrw8v" (UID: "61a4efa4-9ac5-47c3-ba8a-6fa191936c56") : secret "dns-default-metrics-tls" not found Apr 17 07:52:20.518539 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:20.518497 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:20.519071 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:20.518645 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 07:52:20.519134 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:20.519101 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs podName:2b638776-5000-47ee-92c9-8ba7655f560c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:52.518702405 +0000 UTC m=+66.228305895 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs") pod "network-metrics-daemon-qq9zp" (UID: "2b638776-5000-47ee-92c9-8ba7655f560c") : secret "metrics-daemon-secret" not found Apr 17 07:52:20.619457 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:20.619421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72nml\" (UniqueName: \"kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml\") pod \"network-check-target-ml85w\" (UID: \"45bb327f-877b-4da6-8c2d-e760eb62707d\") " pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:20.622042 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:20.622013 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72nml\" (UniqueName: \"kubernetes.io/projected/45bb327f-877b-4da6-8c2d-e760eb62707d-kube-api-access-72nml\") pod \"network-check-target-ml85w\" (UID: \"45bb327f-877b-4da6-8c2d-e760eb62707d\") " pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:20.788925 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:20.788831 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:20.922016 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:20.921544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:20.922016 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:20.921820 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:20.922016 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:20.921678 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:20.922016 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:20.922003 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls podName:61a4efa4-9ac5-47c3-ba8a-6fa191936c56 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:22.921990836 +0000 UTC m=+36.631594327 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls") pod "dns-default-lrw8v" (UID: "61a4efa4-9ac5-47c3-ba8a-6fa191936c56") : secret "dns-default-metrics-tls" not found Apr 17 07:52:20.922306 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:20.921954 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:20.922374 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:20.922315 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert podName:d4301eb7-7ba7-42b5-961e-ef10f1fe7955 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:22.92230199 +0000 UTC m=+36.631905502 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert") pod "ingress-canary-bh4vm" (UID: "d4301eb7-7ba7-42b5-961e-ef10f1fe7955") : secret "canary-serving-cert" not found Apr 17 07:52:21.045454 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:21.045390 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ml85w"] Apr 17 07:52:21.050381 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:52:21.050336 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45bb327f_877b_4da6_8c2d_e760eb62707d.slice/crio-92eb6a5d88367c31b84b64dac691d47d029814fe4a3e8b1bb611ac197eb5a092 WatchSource:0}: Error finding container 92eb6a5d88367c31b84b64dac691d47d029814fe4a3e8b1bb611ac197eb5a092: Status 404 returned error can't find the container with id 92eb6a5d88367c31b84b64dac691d47d029814fe4a3e8b1bb611ac197eb5a092 Apr 17 07:52:21.060047 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:21.060024 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-ml85w" event={"ID":"45bb327f-877b-4da6-8c2d-e760eb62707d","Type":"ContainerStarted","Data":"92eb6a5d88367c31b84b64dac691d47d029814fe4a3e8b1bb611ac197eb5a092"} Apr 17 07:52:22.065060 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:22.065022 2579 generic.go:358] "Generic (PLEG): container finished" podID="83dbb733-5a1c-4565-b720-5b5bf99f74b8" containerID="1e90b65ea6742e5318ce8450a221bddac4c2a8ed2388cf665b49230830688c99" exitCode=0 Apr 17 07:52:22.065581 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:22.065084 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wg6r2" event={"ID":"83dbb733-5a1c-4565-b720-5b5bf99f74b8","Type":"ContainerDied","Data":"1e90b65ea6742e5318ce8450a221bddac4c2a8ed2388cf665b49230830688c99"} Apr 17 07:52:22.938836 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:22.938803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:22.939047 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:22.938860 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:22.939047 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:22.938979 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:22.939047 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:22.938987 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not 
found Apr 17 07:52:22.939047 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:22.939042 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls podName:61a4efa4-9ac5-47c3-ba8a-6fa191936c56 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.939024874 +0000 UTC m=+40.648628365 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls") pod "dns-default-lrw8v" (UID: "61a4efa4-9ac5-47c3-ba8a-6fa191936c56") : secret "dns-default-metrics-tls" not found Apr 17 07:52:22.939251 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:22.939061 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert podName:d4301eb7-7ba7-42b5-961e-ef10f1fe7955 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.939053737 +0000 UTC m=+40.648657227 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert") pod "ingress-canary-bh4vm" (UID: "d4301eb7-7ba7-42b5-961e-ef10f1fe7955") : secret "canary-serving-cert" not found Apr 17 07:52:23.070182 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:23.070146 2579 generic.go:358] "Generic (PLEG): container finished" podID="83dbb733-5a1c-4565-b720-5b5bf99f74b8" containerID="78f8bc101e8865a0b24282191b7409f21c80675c83876feb8a3e8f7a913bc02c" exitCode=0 Apr 17 07:52:23.070643 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:23.070218 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wg6r2" event={"ID":"83dbb733-5a1c-4565-b720-5b5bf99f74b8","Type":"ContainerDied","Data":"78f8bc101e8865a0b24282191b7409f21c80675c83876feb8a3e8f7a913bc02c"} Apr 17 07:52:24.075233 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:24.074989 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wg6r2" event={"ID":"83dbb733-5a1c-4565-b720-5b5bf99f74b8","Type":"ContainerStarted","Data":"f30d9b2a223c5e1bfd2b23e31be54ae0e827d0eea7f195b951862faab5147dd8"} Apr 17 07:52:24.110104 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:24.110063 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wg6r2" podStartSLOduration=6.732391756 podStartE2EDuration="38.110049924s" podCreationTimestamp="2026-04-17 07:51:46 +0000 UTC" firstStartedPulling="2026-04-17 07:51:49.491636234 +0000 UTC m=+3.201239725" lastFinishedPulling="2026-04-17 07:52:20.869294402 +0000 UTC m=+34.578897893" observedRunningTime="2026-04-17 07:52:24.105427338 +0000 UTC m=+37.815030852" watchObservedRunningTime="2026-04-17 07:52:24.110049924 +0000 UTC m=+37.819653437" Apr 17 07:52:25.077796 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:25.077761 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-ml85w" event={"ID":"45bb327f-877b-4da6-8c2d-e760eb62707d","Type":"ContainerStarted","Data":"ee80916f719607c81b08198fb722f2b5ce7997a9dbebae76e2918a03d2cc1c87"} Apr 17 07:52:25.098543 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:25.098503 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ml85w" podStartSLOduration=35.142632672 podStartE2EDuration="38.098491005s" podCreationTimestamp="2026-04-17 07:51:47 +0000 UTC" firstStartedPulling="2026-04-17 07:52:21.052225815 +0000 UTC m=+34.761829305" lastFinishedPulling="2026-04-17 07:52:24.008084138 +0000 UTC m=+37.717687638" observedRunningTime="2026-04-17 07:52:25.098109398 +0000 UTC m=+38.807712911" watchObservedRunningTime="2026-04-17 07:52:25.098491005 +0000 UTC m=+38.808094517" Apr 17 07:52:26.080057 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:26.080022 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:52:26.967334 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:26.967289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:26.967334 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:26.967341 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:26.967532 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:26.967435 2579 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:26.967532 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:26.967453 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:26.967532 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:26.967490 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert podName:d4301eb7-7ba7-42b5-961e-ef10f1fe7955 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:34.967476889 +0000 UTC m=+48.677080379 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert") pod "ingress-canary-bh4vm" (UID: "d4301eb7-7ba7-42b5-961e-ef10f1fe7955") : secret "canary-serving-cert" not found Apr 17 07:52:26.967532 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:26.967509 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls podName:61a4efa4-9ac5-47c3-ba8a-6fa191936c56 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:34.967496398 +0000 UTC m=+48.677099888 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls") pod "dns-default-lrw8v" (UID: "61a4efa4-9ac5-47c3-ba8a-6fa191936c56") : secret "dns-default-metrics-tls" not found Apr 17 07:52:35.020635 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:35.020595 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:35.021133 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:35.020644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:35.021133 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:35.020784 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:35.021133 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:35.020800 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:35.021133 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:35.020847 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls podName:61a4efa4-9ac5-47c3-ba8a-6fa191936c56 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:51.020831883 +0000 UTC m=+64.730435373 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls") pod "dns-default-lrw8v" (UID: "61a4efa4-9ac5-47c3-ba8a-6fa191936c56") : secret "dns-default-metrics-tls" not found Apr 17 07:52:35.021133 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:35.020865 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert podName:d4301eb7-7ba7-42b5-961e-ef10f1fe7955 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:51.020854649 +0000 UTC m=+64.730458139 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert") pod "ingress-canary-bh4vm" (UID: "d4301eb7-7ba7-42b5-961e-ef10f1fe7955") : secret "canary-serving-cert" not found Apr 17 07:52:46.070268 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:46.070233 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ldxs6" Apr 17 07:52:51.115005 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:51.114956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:52:51.115502 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:51.115026 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:52:51.115502 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:51.115092 2579 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:51.115502 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:51.115101 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:51.115502 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:51.115154 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls podName:61a4efa4-9ac5-47c3-ba8a-6fa191936c56 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:23.115140067 +0000 UTC m=+96.824743557 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls") pod "dns-default-lrw8v" (UID: "61a4efa4-9ac5-47c3-ba8a-6fa191936c56") : secret "dns-default-metrics-tls" not found Apr 17 07:52:51.115502 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:51.115167 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert podName:d4301eb7-7ba7-42b5-961e-ef10f1fe7955 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:23.115160968 +0000 UTC m=+96.824764458 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert") pod "ingress-canary-bh4vm" (UID: "d4301eb7-7ba7-42b5-961e-ef10f1fe7955") : secret "canary-serving-cert" not found Apr 17 07:52:52.523803 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:52.523751 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:52:52.524189 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:52.523926 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 07:52:52.524189 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:52:52.523993 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs podName:2b638776-5000-47ee-92c9-8ba7655f560c nodeName:}" failed. No retries permitted until 2026-04-17 07:53:56.523974913 +0000 UTC m=+130.233578420 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs") pod "network-metrics-daemon-qq9zp" (UID: "2b638776-5000-47ee-92c9-8ba7655f560c") : secret "metrics-daemon-secret" not found Apr 17 07:52:57.083829 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:52:57.083796 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ml85w" Apr 17 07:53:23.138781 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:53:23.138623 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v" Apr 17 07:53:23.138781 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:53:23.138688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm" Apr 17 07:53:23.138781 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:53:23.138785 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:53:23.139334 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:53:23.138789 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:53:23.139334 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:53:23.138838 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert podName:d4301eb7-7ba7-42b5-961e-ef10f1fe7955 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:54:27.138824193 +0000 UTC m=+160.848427684 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert") pod "ingress-canary-bh4vm" (UID: "d4301eb7-7ba7-42b5-961e-ef10f1fe7955") : secret "canary-serving-cert" not found Apr 17 07:53:23.139334 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:53:23.138851 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls podName:61a4efa4-9ac5-47c3-ba8a-6fa191936c56 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:27.138845715 +0000 UTC m=+160.848449205 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls") pod "dns-default-lrw8v" (UID: "61a4efa4-9ac5-47c3-ba8a-6fa191936c56") : secret "dns-default-metrics-tls" not found Apr 17 07:53:56.568305 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:53:56.568265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:53:56.568809 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:53:56.568368 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 07:53:56.568809 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:53:56.568426 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs podName:2b638776-5000-47ee-92c9-8ba7655f560c nodeName:}" failed. 
No retries permitted until 2026-04-17 07:55:58.568412433 +0000 UTC m=+252.278015924 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs") pod "network-metrics-daemon-qq9zp" (UID: "2b638776-5000-47ee-92c9-8ba7655f560c") : secret "metrics-daemon-secret" not found Apr 17 07:54:01.103384 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.103340 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq"] Apr 17 07:54:01.106080 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.106064 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq" Apr 17 07:54:01.108359 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.108340 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-zxgsh\"" Apr 17 07:54:01.109034 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.109013 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 07:54:01.109133 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.109018 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:54:01.113501 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.113480 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq"] Apr 17 07:54:01.201346 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.201306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl56c\" (UniqueName: 
\"kubernetes.io/projected/a893b496-3c99-4d44-969c-deb90700402f-kube-api-access-tl56c\") pod \"volume-data-source-validator-7c6cbb6c87-8fnbq\" (UID: \"a893b496-3c99-4d44-969c-deb90700402f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq" Apr 17 07:54:01.204527 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.204502 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp"] Apr 17 07:54:01.207304 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.207279 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp" Apr 17 07:54:01.207988 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.207967 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-558bd67658-kz8wz"] Apr 17 07:54:01.209467 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.209444 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-jf4fk\"" Apr 17 07:54:01.210501 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.210481 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.212565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.212547 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 07:54:01.212653 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.212593 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 07:54:01.212824 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.212807 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 07:54:01.212990 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.212971 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6b9tl\""
Apr 17 07:54:01.239044 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.222185 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 07:54:01.239044 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.223401 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp"]
Apr 17 07:54:01.239044 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.229049 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-558bd67658-kz8wz"]
Apr 17 07:54:01.302003 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.301971 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.302003 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.302003 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-trusted-ca\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.302212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.302023 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-image-registry-private-configuration\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.302212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.302093 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6a1cde0-f02d-40ca-91b3-342b21f46121-ca-trust-extracted\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.302212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.302132 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-installation-pull-secrets\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.302212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.302156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mxv6\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-kube-api-access-9mxv6\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.302212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.302176 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-certificates\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.302212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.302205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl56c\" (UniqueName: \"kubernetes.io/projected/a893b496-3c99-4d44-969c-deb90700402f-kube-api-access-tl56c\") pod \"volume-data-source-validator-7c6cbb6c87-8fnbq\" (UID: \"a893b496-3c99-4d44-969c-deb90700402f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq"
Apr 17 07:54:01.302468 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.302291 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6s4b\" (UniqueName: \"kubernetes.io/projected/93dafc8e-891d-4f40-b29c-0e82ac63515a-kube-api-access-k6s4b\") pod \"network-check-source-8894fc9bd-4d2hp\" (UID: \"93dafc8e-891d-4f40-b29c-0e82ac63515a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp"
Apr 17 07:54:01.302468 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.302332 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-bound-sa-token\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.315283 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.315251 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"]
Apr 17 07:54:01.318017 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.318000 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7wts4"]
Apr 17 07:54:01.318163 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.318146 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:01.320647 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.320629 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 07:54:01.320756 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.320704 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.321111 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.321093 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hnr6b\""
Apr 17 07:54:01.321443 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.321423 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 07:54:01.321949 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.321923 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 07:54:01.322250 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.322235 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 07:54:01.322597 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.322581 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-wjxmr\""
Apr 17 07:54:01.322679 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.322601 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 07:54:01.324414 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.324378 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 07:54:01.324562 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.324413 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 07:54:01.324562 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.324429 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:54:01.328162 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.328143 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"]
Apr 17 07:54:01.329705 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.329686 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 17 07:54:01.331324 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.331303 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl56c\" (UniqueName: \"kubernetes.io/projected/a893b496-3c99-4d44-969c-deb90700402f-kube-api-access-tl56c\") pod \"volume-data-source-validator-7c6cbb6c87-8fnbq\" (UID: \"a893b496-3c99-4d44-969c-deb90700402f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq"
Apr 17 07:54:01.334831 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.334796 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7wts4"]
Apr 17 07:54:01.402963 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.402939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.403098 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.402966 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-trusted-ca\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.403098 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.402991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cbnv\" (UniqueName: \"kubernetes.io/projected/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-kube-api-access-5cbnv\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.403098 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-image-registry-private-configuration\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.403098 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:01.403083 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:54:01.403098 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:01.403097 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558bd67658-kz8wz: secret "image-registry-tls" not found
Apr 17 07:54:01.403292 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:01.403156 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls podName:d6a1cde0-f02d-40ca-91b3-342b21f46121 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:01.903137153 +0000 UTC m=+135.612740646 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls") pod "image-registry-558bd67658-kz8wz" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121") : secret "image-registry-tls" not found
Apr 17 07:54:01.403292 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403081 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wxdg\" (UniqueName: \"kubernetes.io/projected/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-kube-api-access-8wxdg\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:01.403292 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403208 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6a1cde0-f02d-40ca-91b3-342b21f46121-ca-trust-extracted\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.403292 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403238 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-serving-cert\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.403292 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403261 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-installation-pull-secrets\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.403292 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403276 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mxv6\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-kube-api-access-9mxv6\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.403587 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403302 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:01.403587 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403334 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-certificates\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.403587 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403371 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-config\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.403587 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403409 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-trusted-ca\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.403587 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403437 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:01.403587 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403469 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6s4b\" (UniqueName: \"kubernetes.io/projected/93dafc8e-891d-4f40-b29c-0e82ac63515a-kube-api-access-k6s4b\") pod \"network-check-source-8894fc9bd-4d2hp\" (UID: \"93dafc8e-891d-4f40-b29c-0e82ac63515a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp"
Apr 17 07:54:01.403587 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403495 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-bound-sa-token\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.403863 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.403612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6a1cde0-f02d-40ca-91b3-342b21f46121-ca-trust-extracted\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.404391 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.404367 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-certificates\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.404436 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.404398 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-trusted-ca\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.405717 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.405700 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-image-registry-private-configuration\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.405808 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.405732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-installation-pull-secrets\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.411497 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.411476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6s4b\" (UniqueName: \"kubernetes.io/projected/93dafc8e-891d-4f40-b29c-0e82ac63515a-kube-api-access-k6s4b\") pod \"network-check-source-8894fc9bd-4d2hp\" (UID: \"93dafc8e-891d-4f40-b29c-0e82ac63515a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp"
Apr 17 07:54:01.411610 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.411594 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mxv6\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-kube-api-access-9mxv6\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.413610 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.413586 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-bound-sa-token\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.415387 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.415372 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq"
Apr 17 07:54:01.504399 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.504369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:01.504399 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.504411 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-config\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.504590 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:01.504551 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:01.504627 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.504588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-trusted-ca\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.504669 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:01.504629 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls podName:81317276-c9cb-47e7-a1e6-ff72bed6bfc7 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.004606554 +0000 UTC m=+135.714210065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-plmjw" (UID: "81317276-c9cb-47e7-a1e6-ff72bed6bfc7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:01.504723 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.504677 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:01.504774 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.504737 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cbnv\" (UniqueName: \"kubernetes.io/projected/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-kube-api-access-5cbnv\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.504831 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.504778 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wxdg\" (UniqueName: \"kubernetes.io/projected/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-kube-api-access-8wxdg\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:01.504878 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.504837 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-serving-cert\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.505223 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.505175 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-config\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.505610 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.505589 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-trusted-ca\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.506064 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.506029 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:01.507717 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.507699 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-serving-cert\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.512384 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.512359 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wxdg\" (UniqueName: \"kubernetes.io/projected/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-kube-api-access-8wxdg\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:01.512533 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.512513 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cbnv\" (UniqueName: \"kubernetes.io/projected/e3b4cab1-15d3-4640-89c6-ec91e734f2fd-kube-api-access-5cbnv\") pod \"console-operator-9d4b6777b-7wts4\" (UID: \"e3b4cab1-15d3-4640-89c6-ec91e734f2fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.519276 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.519254 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp"
Apr 17 07:54:01.531091 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.531070 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq"]
Apr 17 07:54:01.534355 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:01.534330 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda893b496_3c99_4d44_969c_deb90700402f.slice/crio-ecd96ff2bc132b4dd3b8b35cc6d4bf7ad1f122e2698370fba21e7a292cec0612 WatchSource:0}: Error finding container ecd96ff2bc132b4dd3b8b35cc6d4bf7ad1f122e2698370fba21e7a292cec0612: Status 404 returned error can't find the container with id ecd96ff2bc132b4dd3b8b35cc6d4bf7ad1f122e2698370fba21e7a292cec0612
Apr 17 07:54:01.628801 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.628768 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp"]
Apr 17 07:54:01.631488 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:01.631463 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93dafc8e_891d_4f40_b29c_0e82ac63515a.slice/crio-c78d1a476593455e755fa04755b36f100663a9c088a0a1318a56326559d136cf WatchSource:0}: Error finding container c78d1a476593455e755fa04755b36f100663a9c088a0a1318a56326559d136cf: Status 404 returned error can't find the container with id c78d1a476593455e755fa04755b36f100663a9c088a0a1318a56326559d136cf
Apr 17 07:54:01.640041 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.640020 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:01.756151 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.756120 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7wts4"]
Apr 17 07:54:01.759749 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:01.759724 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3b4cab1_15d3_4640_89c6_ec91e734f2fd.slice/crio-e306196ce1f929e7c2859bf13114e439a21e803e2c980129c32b19cdc5f5b9d2 WatchSource:0}: Error finding container e306196ce1f929e7c2859bf13114e439a21e803e2c980129c32b19cdc5f5b9d2: Status 404 returned error can't find the container with id e306196ce1f929e7c2859bf13114e439a21e803e2c980129c32b19cdc5f5b9d2
Apr 17 07:54:01.908290 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:01.908248 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:01.908432 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:01.908389 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:54:01.908432 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:01.908407 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558bd67658-kz8wz: secret "image-registry-tls" not found
Apr 17 07:54:01.908534 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:01.908467 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls podName:d6a1cde0-f02d-40ca-91b3-342b21f46121 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.908450263 +0000 UTC m=+136.618053770 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls") pod "image-registry-558bd67658-kz8wz" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121") : secret "image-registry-tls" not found
Apr 17 07:54:02.009419 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:02.009340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:02.009580 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:02.009536 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:02.009656 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:02.009641 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls podName:81317276-c9cb-47e7-a1e6-ff72bed6bfc7 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:03.0096187 +0000 UTC m=+136.719222207 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-plmjw" (UID: "81317276-c9cb-47e7-a1e6-ff72bed6bfc7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:02.254937 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:02.254842 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq" event={"ID":"a893b496-3c99-4d44-969c-deb90700402f","Type":"ContainerStarted","Data":"ecd96ff2bc132b4dd3b8b35cc6d4bf7ad1f122e2698370fba21e7a292cec0612"}
Apr 17 07:54:02.256422 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:02.256391 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp" event={"ID":"93dafc8e-891d-4f40-b29c-0e82ac63515a","Type":"ContainerStarted","Data":"af8bd7bf852eccd5b6816f11482fa74c76bde4ef1de1e1f34e7933db2e37206d"}
Apr 17 07:54:02.256557 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:02.256429 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp" event={"ID":"93dafc8e-891d-4f40-b29c-0e82ac63515a","Type":"ContainerStarted","Data":"c78d1a476593455e755fa04755b36f100663a9c088a0a1318a56326559d136cf"}
Apr 17 07:54:02.257640 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:02.257619 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" event={"ID":"e3b4cab1-15d3-4640-89c6-ec91e734f2fd","Type":"ContainerStarted","Data":"e306196ce1f929e7c2859bf13114e439a21e803e2c980129c32b19cdc5f5b9d2"}
Apr 17 07:54:02.271865 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:02.271773 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d2hp" podStartSLOduration=1.271756535 podStartE2EDuration="1.271756535s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:02.271093223 +0000 UTC m=+135.980696736" watchObservedRunningTime="2026-04-17 07:54:02.271756535 +0000 UTC m=+135.981360038"
Apr 17 07:54:02.915566 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:02.915474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:02.915727 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:02.915653 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:54:02.915727 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:02.915675 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558bd67658-kz8wz: secret "image-registry-tls" not found
Apr 17 07:54:02.915832 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:02.915742 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls podName:d6a1cde0-f02d-40ca-91b3-342b21f46121 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:04.915720028 +0000 UTC m=+138.625323522 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls") pod "image-registry-558bd67658-kz8wz" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121") : secret "image-registry-tls" not found
Apr 17 07:54:03.016139 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:03.016083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:03.016311 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:03.016227 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:03.016311 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:03.016292 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls podName:81317276-c9cb-47e7-a1e6-ff72bed6bfc7 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:05.016272167 +0000 UTC m=+138.725875660 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-plmjw" (UID: "81317276-c9cb-47e7-a1e6-ff72bed6bfc7") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:03.261556 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:03.261465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq" event={"ID":"a893b496-3c99-4d44-969c-deb90700402f","Type":"ContainerStarted","Data":"93a88b333deb41e918358e58939d139d348ed0e3c7854f961e07715629a0d8a0"} Apr 17 07:54:03.276268 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:03.276220 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8fnbq" podStartSLOduration=1.147387748 podStartE2EDuration="2.276205247s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="2026-04-17 07:54:01.536269753 +0000 UTC m=+135.245873246" lastFinishedPulling="2026-04-17 07:54:02.665087236 +0000 UTC m=+136.374690745" observedRunningTime="2026-04-17 07:54:03.275726493 +0000 UTC m=+136.985330008" watchObservedRunningTime="2026-04-17 07:54:03.276205247 +0000 UTC m=+136.985808760" Apr 17 07:54:04.264109 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:04.264082 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/0.log" Apr 17 07:54:04.264462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:04.264121 2579 generic.go:358] "Generic (PLEG): container finished" podID="e3b4cab1-15d3-4640-89c6-ec91e734f2fd" containerID="23ebbd454bfc2e6b4632f8c345cf8fc5c3366adbc76558dfef3826d03b4b2de9" exitCode=255 Apr 17 07:54:04.264462 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:54:04.264236 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" event={"ID":"e3b4cab1-15d3-4640-89c6-ec91e734f2fd","Type":"ContainerDied","Data":"23ebbd454bfc2e6b4632f8c345cf8fc5c3366adbc76558dfef3826d03b4b2de9"} Apr 17 07:54:04.264462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:04.264404 2579 scope.go:117] "RemoveContainer" containerID="23ebbd454bfc2e6b4632f8c345cf8fc5c3366adbc76558dfef3826d03b4b2de9" Apr 17 07:54:04.933168 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:04.933135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz" Apr 17 07:54:04.933307 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:04.933261 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:54:04.933307 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:04.933272 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558bd67658-kz8wz: secret "image-registry-tls" not found Apr 17 07:54:04.933395 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:04.933317 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls podName:d6a1cde0-f02d-40ca-91b3-342b21f46121 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:08.933303994 +0000 UTC m=+142.642907485 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls") pod "image-registry-558bd67658-kz8wz" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121") : secret "image-registry-tls" not found Apr 17 07:54:05.034322 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.034283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw" Apr 17 07:54:05.034423 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:05.034393 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:05.034473 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:05.034443 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls podName:81317276-c9cb-47e7-a1e6-ff72bed6bfc7 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:09.034429819 +0000 UTC m=+142.744033310 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-plmjw" (UID: "81317276-c9cb-47e7-a1e6-ff72bed6bfc7") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:05.267436 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.267366 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/1.log" Apr 17 07:54:05.267775 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.267755 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/0.log" Apr 17 07:54:05.267809 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.267786 2579 generic.go:358] "Generic (PLEG): container finished" podID="e3b4cab1-15d3-4640-89c6-ec91e734f2fd" containerID="40a337629a2a48c6f254ce93cd7db445fc72fa033660343d292a9752f1c0038a" exitCode=255 Apr 17 07:54:05.267838 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.267822 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" event={"ID":"e3b4cab1-15d3-4640-89c6-ec91e734f2fd","Type":"ContainerDied","Data":"40a337629a2a48c6f254ce93cd7db445fc72fa033660343d292a9752f1c0038a"} Apr 17 07:54:05.267876 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.267848 2579 scope.go:117] "RemoveContainer" containerID="23ebbd454bfc2e6b4632f8c345cf8fc5c3366adbc76558dfef3826d03b4b2de9" Apr 17 07:54:05.268127 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.268101 2579 scope.go:117] "RemoveContainer" containerID="40a337629a2a48c6f254ce93cd7db445fc72fa033660343d292a9752f1c0038a" Apr 17 07:54:05.268311 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:05.268290 2579 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7wts4_openshift-console-operator(e3b4cab1-15d3-4640-89c6-ec91e734f2fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" podUID="e3b4cab1-15d3-4640-89c6-ec91e734f2fd" Apr 17 07:54:05.400357 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.400326 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v"] Apr 17 07:54:05.404485 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.404461 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v" Apr 17 07:54:05.406686 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.406665 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 07:54:05.406811 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.406726 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 07:54:05.406811 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.406766 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-tsdhh\"" Apr 17 07:54:05.419715 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.419692 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v"] Apr 17 07:54:05.538639 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.538560 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6jd9\" (UniqueName: 
\"kubernetes.io/projected/b7e592dd-647e-4a90-b403-e187b9b2475f-kube-api-access-f6jd9\") pod \"migrator-74bb7799d9-kbg5v\" (UID: \"b7e592dd-647e-4a90-b403-e187b9b2475f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v" Apr 17 07:54:05.639874 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.639828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6jd9\" (UniqueName: \"kubernetes.io/projected/b7e592dd-647e-4a90-b403-e187b9b2475f-kube-api-access-f6jd9\") pod \"migrator-74bb7799d9-kbg5v\" (UID: \"b7e592dd-647e-4a90-b403-e187b9b2475f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v" Apr 17 07:54:05.650047 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.650021 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6jd9\" (UniqueName: \"kubernetes.io/projected/b7e592dd-647e-4a90-b403-e187b9b2475f-kube-api-access-f6jd9\") pod \"migrator-74bb7799d9-kbg5v\" (UID: \"b7e592dd-647e-4a90-b403-e187b9b2475f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v" Apr 17 07:54:05.712826 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.712795 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v" Apr 17 07:54:05.828121 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:05.828050 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v"] Apr 17 07:54:05.830702 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:05.830673 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e592dd_647e_4a90_b403_e187b9b2475f.slice/crio-64bee9d8378ff5e6961303ccaf86dfee598aa545fab62341f5c845c5233be2f2 WatchSource:0}: Error finding container 64bee9d8378ff5e6961303ccaf86dfee598aa545fab62341f5c845c5233be2f2: Status 404 returned error can't find the container with id 64bee9d8378ff5e6961303ccaf86dfee598aa545fab62341f5c845c5233be2f2 Apr 17 07:54:06.271181 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:06.271149 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/1.log" Apr 17 07:54:06.271625 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:06.271539 2579 scope.go:117] "RemoveContainer" containerID="40a337629a2a48c6f254ce93cd7db445fc72fa033660343d292a9752f1c0038a" Apr 17 07:54:06.271743 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:06.271722 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7wts4_openshift-console-operator(e3b4cab1-15d3-4640-89c6-ec91e734f2fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" podUID="e3b4cab1-15d3-4640-89c6-ec91e734f2fd" Apr 17 07:54:06.272310 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:06.272289 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v" event={"ID":"b7e592dd-647e-4a90-b403-e187b9b2475f","Type":"ContainerStarted","Data":"64bee9d8378ff5e6961303ccaf86dfee598aa545fab62341f5c845c5233be2f2"} Apr 17 07:54:07.241440 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:07.241410 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-62tj8_e5343931-acc9-4a96-81bd-fc6bbad4d9be/dns-node-resolver/0.log" Apr 17 07:54:07.276303 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:07.276270 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v" event={"ID":"b7e592dd-647e-4a90-b403-e187b9b2475f","Type":"ContainerStarted","Data":"e72b4386bfaca5244df757f64ba1f946c270d077506aebfaa2e37ab7f1f7c0a7"} Apr 17 07:54:07.276628 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:07.276307 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v" event={"ID":"b7e592dd-647e-4a90-b403-e187b9b2475f","Type":"ContainerStarted","Data":"1b96ac68b54ab3de70d2611b1c4f0143be5540343684c80dcbfa6cb93352ac8b"} Apr 17 07:54:07.292669 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:07.292618 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kbg5v" podStartSLOduration=1.450027559 podStartE2EDuration="2.292601614s" podCreationTimestamp="2026-04-17 07:54:05 +0000 UTC" firstStartedPulling="2026-04-17 07:54:05.832494353 +0000 UTC m=+139.542097848" lastFinishedPulling="2026-04-17 07:54:06.675068412 +0000 UTC m=+140.384671903" observedRunningTime="2026-04-17 07:54:07.29131359 +0000 UTC m=+141.000917102" watchObservedRunningTime="2026-04-17 07:54:07.292601614 +0000 UTC m=+141.002205128" Apr 17 07:54:08.248118 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.248085 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca/service-ca-865cb79987-ttkbm"] Apr 17 07:54:08.250961 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.250946 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.252975 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.252954 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 07:54:08.253623 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.253602 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-z7wtq\"" Apr 17 07:54:08.253702 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.253627 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 07:54:08.253702 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.253604 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 07:54:08.253702 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.253665 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 07:54:08.259564 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.259543 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ttkbm"] Apr 17 07:54:08.360987 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.360930 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef1517c9-4d6f-40d9-8133-78902a77304e-signing-key\") pod \"service-ca-865cb79987-ttkbm\" (UID: \"ef1517c9-4d6f-40d9-8133-78902a77304e\") " pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.360987 ip-10-0-142-45 kubenswrapper[2579]: 
I0417 07:54:08.360991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc4hh\" (UniqueName: \"kubernetes.io/projected/ef1517c9-4d6f-40d9-8133-78902a77304e-kube-api-access-kc4hh\") pod \"service-ca-865cb79987-ttkbm\" (UID: \"ef1517c9-4d6f-40d9-8133-78902a77304e\") " pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.361423 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.361061 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef1517c9-4d6f-40d9-8133-78902a77304e-signing-cabundle\") pod \"service-ca-865cb79987-ttkbm\" (UID: \"ef1517c9-4d6f-40d9-8133-78902a77304e\") " pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.462403 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.462356 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef1517c9-4d6f-40d9-8133-78902a77304e-signing-key\") pod \"service-ca-865cb79987-ttkbm\" (UID: \"ef1517c9-4d6f-40d9-8133-78902a77304e\") " pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.462403 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.462411 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc4hh\" (UniqueName: \"kubernetes.io/projected/ef1517c9-4d6f-40d9-8133-78902a77304e-kube-api-access-kc4hh\") pod \"service-ca-865cb79987-ttkbm\" (UID: \"ef1517c9-4d6f-40d9-8133-78902a77304e\") " pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.462596 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.462525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef1517c9-4d6f-40d9-8133-78902a77304e-signing-cabundle\") pod \"service-ca-865cb79987-ttkbm\" (UID: 
\"ef1517c9-4d6f-40d9-8133-78902a77304e\") " pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.463147 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.463129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef1517c9-4d6f-40d9-8133-78902a77304e-signing-cabundle\") pod \"service-ca-865cb79987-ttkbm\" (UID: \"ef1517c9-4d6f-40d9-8133-78902a77304e\") " pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.464917 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.464878 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef1517c9-4d6f-40d9-8133-78902a77304e-signing-key\") pod \"service-ca-865cb79987-ttkbm\" (UID: \"ef1517c9-4d6f-40d9-8133-78902a77304e\") " pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.470770 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.470747 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc4hh\" (UniqueName: \"kubernetes.io/projected/ef1517c9-4d6f-40d9-8133-78902a77304e-kube-api-access-kc4hh\") pod \"service-ca-865cb79987-ttkbm\" (UID: \"ef1517c9-4d6f-40d9-8133-78902a77304e\") " pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.560430 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.560345 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ttkbm" Apr 17 07:54:08.646150 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.646070 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mvk82_c3da12b2-c527-43e5-96c7-37b56fb6b22d/node-ca/0.log" Apr 17 07:54:08.676195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.676164 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ttkbm"] Apr 17 07:54:08.679962 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:08.679935 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef1517c9_4d6f_40d9_8133_78902a77304e.slice/crio-e252c59e11bcbd990e25147e29e11588b41299b14af111b47fa41f936d55530b WatchSource:0}: Error finding container e252c59e11bcbd990e25147e29e11588b41299b14af111b47fa41f936d55530b: Status 404 returned error can't find the container with id e252c59e11bcbd990e25147e29e11588b41299b14af111b47fa41f936d55530b Apr 17 07:54:08.967689 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:08.967657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz" Apr 17 07:54:08.967859 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:08.967769 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:54:08.967859 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:08.967782 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558bd67658-kz8wz: secret "image-registry-tls" not found Apr 17 07:54:08.967859 ip-10-0-142-45 
kubenswrapper[2579]: E0417 07:54:08.967836 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls podName:d6a1cde0-f02d-40ca-91b3-342b21f46121 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:16.967822032 +0000 UTC m=+150.677425524 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls") pod "image-registry-558bd67658-kz8wz" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121") : secret "image-registry-tls" not found Apr 17 07:54:09.068625 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:09.068589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw" Apr 17 07:54:09.068795 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:09.068735 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:09.068850 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:09.068801 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls podName:81317276-c9cb-47e7-a1e6-ff72bed6bfc7 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:17.068781133 +0000 UTC m=+150.778384627 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-plmjw" (UID: "81317276-c9cb-47e7-a1e6-ff72bed6bfc7") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:09.282950 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:09.282847 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ttkbm" event={"ID":"ef1517c9-4d6f-40d9-8133-78902a77304e","Type":"ContainerStarted","Data":"e252c59e11bcbd990e25147e29e11588b41299b14af111b47fa41f936d55530b"} Apr 17 07:54:10.286735 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:10.286697 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ttkbm" event={"ID":"ef1517c9-4d6f-40d9-8133-78902a77304e","Type":"ContainerStarted","Data":"72b94142636374cce20e13ddb6ad4c511bf21bd1b7eb844dc688770b9a9ee7db"} Apr 17 07:54:10.301799 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:10.301756 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-ttkbm" podStartSLOduration=0.778726002 podStartE2EDuration="2.301739597s" podCreationTimestamp="2026-04-17 07:54:08 +0000 UTC" firstStartedPulling="2026-04-17 07:54:08.681722954 +0000 UTC m=+142.391326445" lastFinishedPulling="2026-04-17 07:54:10.204736543 +0000 UTC m=+143.914340040" observedRunningTime="2026-04-17 07:54:10.301573629 +0000 UTC m=+144.011177142" watchObservedRunningTime="2026-04-17 07:54:10.301739597 +0000 UTC m=+144.011343109" Apr 17 07:54:11.640726 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:11.640692 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" Apr 17 07:54:11.640726 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:11.640733 2579 kubelet.go:2658] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:11.641260 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:11.641211 2579 scope.go:117] "RemoveContainer" containerID="40a337629a2a48c6f254ce93cd7db445fc72fa033660343d292a9752f1c0038a"
Apr 17 07:54:11.641448 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:11.641423 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7wts4_openshift-console-operator(e3b4cab1-15d3-4640-89c6-ec91e734f2fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" podUID="e3b4cab1-15d3-4640-89c6-ec91e734f2fd"
Apr 17 07:54:17.029961 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:17.029924 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:17.032343 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:17.032316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls\") pod \"image-registry-558bd67658-kz8wz\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:17.131253 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:17.131193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:17.131398 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:17.131336 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:17.131454 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:17.131405 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls podName:81317276-c9cb-47e7-a1e6-ff72bed6bfc7 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:33.131389116 +0000 UTC m=+166.840992607 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-plmjw" (UID: "81317276-c9cb-47e7-a1e6-ff72bed6bfc7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:17.133157 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:17.133144 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:17.254712 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:17.254679 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-558bd67658-kz8wz"]
Apr 17 07:54:17.258455 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:17.258429 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a1cde0_f02d_40ca_91b3_342b21f46121.slice/crio-168490dc687a01c78476d3a7134ccf4c69515a09a1549511d1d44ebb1c2db46c WatchSource:0}: Error finding container 168490dc687a01c78476d3a7134ccf4c69515a09a1549511d1d44ebb1c2db46c: Status 404 returned error can't find the container with id 168490dc687a01c78476d3a7134ccf4c69515a09a1549511d1d44ebb1c2db46c
Apr 17 07:54:17.304901 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:17.304842 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-558bd67658-kz8wz" event={"ID":"d6a1cde0-f02d-40ca-91b3-342b21f46121","Type":"ContainerStarted","Data":"168490dc687a01c78476d3a7134ccf4c69515a09a1549511d1d44ebb1c2db46c"}
Apr 17 07:54:18.308403 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:18.308367 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-558bd67658-kz8wz" event={"ID":"d6a1cde0-f02d-40ca-91b3-342b21f46121","Type":"ContainerStarted","Data":"dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb"}
Apr 17 07:54:18.308800 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:18.308516 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-558bd67658-kz8wz"
Apr 17 07:54:18.332030 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:18.331978 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-558bd67658-kz8wz" podStartSLOduration=17.331962006 podStartE2EDuration="17.331962006s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:18.331774991 +0000 UTC m=+152.041378505" watchObservedRunningTime="2026-04-17 07:54:18.331962006 +0000 UTC m=+152.041565516"
Apr 17 07:54:22.149692 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:22.149652 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lrw8v" podUID="61a4efa4-9ac5-47c3-ba8a-6fa191936c56"
Apr 17 07:54:22.154739 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:22.154718 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bh4vm" podUID="d4301eb7-7ba7-42b5-961e-ef10f1fe7955"
Apr 17 07:54:22.317197 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:22.317166 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bh4vm"
Apr 17 07:54:22.317383 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:22.317166 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lrw8v"
Apr 17 07:54:22.895069 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:22.895033 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qq9zp" podUID="2b638776-5000-47ee-92c9-8ba7655f560c"
Apr 17 07:54:23.877779 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:23.877726 2579 scope.go:117] "RemoveContainer" containerID="40a337629a2a48c6f254ce93cd7db445fc72fa033660343d292a9752f1c0038a"
Apr 17 07:54:24.324953 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:24.324855 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log"
Apr 17 07:54:24.325291 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:24.325274 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/1.log"
Apr 17 07:54:24.325343 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:24.325309 2579 generic.go:358] "Generic (PLEG): container finished" podID="e3b4cab1-15d3-4640-89c6-ec91e734f2fd" containerID="6c07e074e0003a1d17e5324a336b07c09b0f070df073b0613e4da14e6f212f2b" exitCode=255
Apr 17 07:54:24.325441 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:24.325344 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" event={"ID":"e3b4cab1-15d3-4640-89c6-ec91e734f2fd","Type":"ContainerDied","Data":"6c07e074e0003a1d17e5324a336b07c09b0f070df073b0613e4da14e6f212f2b"}
Apr 17 07:54:24.325441 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:24.325372 2579 scope.go:117] "RemoveContainer" containerID="40a337629a2a48c6f254ce93cd7db445fc72fa033660343d292a9752f1c0038a"
Apr 17 07:54:24.325733 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:24.325715 2579 scope.go:117] "RemoveContainer" containerID="6c07e074e0003a1d17e5324a336b07c09b0f070df073b0613e4da14e6f212f2b"
Apr 17 07:54:24.325945 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:24.325914 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-7wts4_openshift-console-operator(e3b4cab1-15d3-4640-89c6-ec91e734f2fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" podUID="e3b4cab1-15d3-4640-89c6-ec91e734f2fd"
Apr 17 07:54:25.328356 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:25.328323 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log"
Apr 17 07:54:27.203601 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:27.203563 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v"
Apr 17 07:54:27.203988 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:27.203621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm"
Apr 17 07:54:27.206155 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:27.206129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a4efa4-9ac5-47c3-ba8a-6fa191936c56-metrics-tls\") pod \"dns-default-lrw8v\" (UID: \"61a4efa4-9ac5-47c3-ba8a-6fa191936c56\") " pod="openshift-dns/dns-default-lrw8v"
Apr 17 07:54:27.206252 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:27.206175 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4301eb7-7ba7-42b5-961e-ef10f1fe7955-cert\") pod \"ingress-canary-bh4vm\" (UID: \"d4301eb7-7ba7-42b5-961e-ef10f1fe7955\") " pod="openshift-ingress-canary/ingress-canary-bh4vm"
Apr 17 07:54:27.420084 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:27.420054 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tgj8b\""
Apr 17 07:54:27.420607 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:27.420590 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-l9k4x\""
Apr 17 07:54:27.429010 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:27.428991 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lrw8v"
Apr 17 07:54:27.429106 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:27.429067 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bh4vm"
Apr 17 07:54:27.551200 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:27.551009 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bh4vm"]
Apr 17 07:54:27.553822 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:27.553793 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4301eb7_7ba7_42b5_961e_ef10f1fe7955.slice/crio-8aec03539e4de4c7c7413474b9e1fb30634ce58bfbb8fa7e25006efba56b2498 WatchSource:0}: Error finding container 8aec03539e4de4c7c7413474b9e1fb30634ce58bfbb8fa7e25006efba56b2498: Status 404 returned error can't find the container with id 8aec03539e4de4c7c7413474b9e1fb30634ce58bfbb8fa7e25006efba56b2498
Apr 17 07:54:27.568135 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:27.568113 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lrw8v"]
Apr 17 07:54:27.570923 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:27.570897 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61a4efa4_9ac5_47c3_ba8a_6fa191936c56.slice/crio-56fb8d87d65249e754c89e22817877547a2c4fef2f93647f414dadd8ad5f51ad WatchSource:0}: Error finding container 56fb8d87d65249e754c89e22817877547a2c4fef2f93647f414dadd8ad5f51ad: Status 404 returned error can't find the container with id 56fb8d87d65249e754c89e22817877547a2c4fef2f93647f414dadd8ad5f51ad
Apr 17 07:54:28.337049 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:28.337007 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lrw8v" event={"ID":"61a4efa4-9ac5-47c3-ba8a-6fa191936c56","Type":"ContainerStarted","Data":"56fb8d87d65249e754c89e22817877547a2c4fef2f93647f414dadd8ad5f51ad"}
Apr 17 07:54:28.338333 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:28.338305 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bh4vm" event={"ID":"d4301eb7-7ba7-42b5-961e-ef10f1fe7955","Type":"ContainerStarted","Data":"8aec03539e4de4c7c7413474b9e1fb30634ce58bfbb8fa7e25006efba56b2498"}
Apr 17 07:54:30.346487 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.346407 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lrw8v" event={"ID":"61a4efa4-9ac5-47c3-ba8a-6fa191936c56","Type":"ContainerStarted","Data":"f3a4f823df20518b082e596b050865338a62f16b446d59da71f6643e18947532"}
Apr 17 07:54:30.346487 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.346444 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lrw8v" event={"ID":"61a4efa4-9ac5-47c3-ba8a-6fa191936c56","Type":"ContainerStarted","Data":"ca5c10f85a77db944e2027a9bcf1413ffdfd2a0deace3533163bec25bd9fe1a8"}
Apr 17 07:54:30.346487 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.346483 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lrw8v"
Apr 17 07:54:30.347651 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.347631 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bh4vm" event={"ID":"d4301eb7-7ba7-42b5-961e-ef10f1fe7955","Type":"ContainerStarted","Data":"494ee602717e5f8a34f4759dd78bb24af6c1790f9af7109b49d4cc7e382ec2b6"}
Apr 17 07:54:30.362715 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.362675 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lrw8v" podStartSLOduration=129.518011233 podStartE2EDuration="2m11.362664001s" podCreationTimestamp="2026-04-17 07:52:19 +0000 UTC" firstStartedPulling="2026-04-17 07:54:27.572575406 +0000 UTC m=+161.282178900" lastFinishedPulling="2026-04-17 07:54:29.417228175 +0000 UTC m=+163.126831668" observedRunningTime="2026-04-17 07:54:30.361675704 +0000 UTC m=+164.071279228" watchObservedRunningTime="2026-04-17 07:54:30.362664001 +0000 UTC m=+164.072267527"
Apr 17 07:54:30.375281 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.375244 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bh4vm" podStartSLOduration=129.510285551 podStartE2EDuration="2m11.375233982s" podCreationTimestamp="2026-04-17 07:52:19 +0000 UTC" firstStartedPulling="2026-04-17 07:54:27.555747289 +0000 UTC m=+161.265350782" lastFinishedPulling="2026-04-17 07:54:29.420695721 +0000 UTC m=+163.130299213" observedRunningTime="2026-04-17 07:54:30.37432836 +0000 UTC m=+164.083931886" watchObservedRunningTime="2026-04-17 07:54:30.375233982 +0000 UTC m=+164.084837528"
Apr 17 07:54:30.759642 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.759607 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-558bd67658-kz8wz"]
Apr 17 07:54:30.871304 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.871271 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-z8d29"]
Apr 17 07:54:30.874408 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.874386 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:30.876981 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.876959 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 07:54:30.876981 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.876970 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 07:54:30.877295 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.877278 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 07:54:30.877506 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.877494 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 07:54:30.877682 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.877670 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-m2z7h\""
Apr 17 07:54:30.890968 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:30.890946 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-z8d29"]
Apr 17 07:54:31.032036 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.031931 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.032036 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.031984 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-crio-socket\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.032227 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.032043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zf2k\" (UniqueName: \"kubernetes.io/projected/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-kube-api-access-9zf2k\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.032227 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.032121 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.032227 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.032157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-data-volume\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.132726 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.132690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zf2k\" (UniqueName: \"kubernetes.io/projected/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-kube-api-access-9zf2k\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.132726 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.132730 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.132966 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.132760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-data-volume\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.132966 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.132832 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.132966 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.132872 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-crio-socket\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.133093 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.133025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-crio-socket\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.133278 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.133238 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-data-volume\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.133455 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.133439 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.135323 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.135302 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.155805 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.155781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zf2k\" (UniqueName: \"kubernetes.io/projected/7cacbfd9-e67b-4bff-ad81-e39e7850f0b0-kube-api-access-9zf2k\") pod \"insights-runtime-extractor-z8d29\" (UID: \"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0\") " pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.184930 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.184905 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-z8d29"
Apr 17 07:54:31.305484 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.305409 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-z8d29"]
Apr 17 07:54:31.308768 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:31.308741 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cacbfd9_e67b_4bff_ad81_e39e7850f0b0.slice/crio-f0d6b3ca1e128fe45a31632d54124b942d76adfc48da3002d0457fc13d56113c WatchSource:0}: Error finding container f0d6b3ca1e128fe45a31632d54124b942d76adfc48da3002d0457fc13d56113c: Status 404 returned error can't find the container with id f0d6b3ca1e128fe45a31632d54124b942d76adfc48da3002d0457fc13d56113c
Apr 17 07:54:31.351860 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.351834 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z8d29" event={"ID":"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0","Type":"ContainerStarted","Data":"f0d6b3ca1e128fe45a31632d54124b942d76adfc48da3002d0457fc13d56113c"}
Apr 17 07:54:31.640801 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.640768 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:31.640997 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.640809 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4"
Apr 17 07:54:31.641231 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:31.641215 2579 scope.go:117] "RemoveContainer" containerID="6c07e074e0003a1d17e5324a336b07c09b0f070df073b0613e4da14e6f212f2b"
Apr 17 07:54:31.641423 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:31.641404 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-7wts4_openshift-console-operator(e3b4cab1-15d3-4640-89c6-ec91e734f2fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" podUID="e3b4cab1-15d3-4640-89c6-ec91e734f2fd"
Apr 17 07:54:32.359772 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:32.359689 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z8d29" event={"ID":"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0","Type":"ContainerStarted","Data":"b8ffe983cf192ed47733cd53789ed3f08b4b70c726215049ef0500b2c26c1957"}
Apr 17 07:54:32.359772 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:32.359735 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z8d29" event={"ID":"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0","Type":"ContainerStarted","Data":"3d0f0247cc203aedc803bf7727e213698af9ef3f0b0d26090842f9a1c8c58770"}
Apr 17 07:54:33.147297 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:33.147247 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:33.149903 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:33.149862 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/81317276-c9cb-47e7-a1e6-ff72bed6bfc7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-plmjw\" (UID: \"81317276-c9cb-47e7-a1e6-ff72bed6bfc7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:33.364573 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:33.364540 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z8d29" event={"ID":"7cacbfd9-e67b-4bff-ad81-e39e7850f0b0","Type":"ContainerStarted","Data":"7716ecd4473c10b37beda4ba90797f74d5061fabda52b7fb898dd1470e075eb3"}
Apr 17 07:54:33.381961 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:33.381924 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-z8d29" podStartSLOduration=1.533955585 podStartE2EDuration="3.381911313s" podCreationTimestamp="2026-04-17 07:54:30 +0000 UTC" firstStartedPulling="2026-04-17 07:54:31.379583029 +0000 UTC m=+165.089186519" lastFinishedPulling="2026-04-17 07:54:33.227538752 +0000 UTC m=+166.937142247" observedRunningTime="2026-04-17 07:54:33.380505975 +0000 UTC m=+167.090109487" watchObservedRunningTime="2026-04-17 07:54:33.381911313 +0000 UTC m=+167.091514823"
Apr 17 07:54:33.430077 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:33.430024 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"
Apr 17 07:54:33.547513 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:33.547490 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw"]
Apr 17 07:54:33.549939 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:33.549910 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81317276_c9cb_47e7_a1e6_ff72bed6bfc7.slice/crio-ea57f1ed35641634df903757001a62d6baae0a8a8d9d0c46db27b78bd5a6edc8 WatchSource:0}: Error finding container ea57f1ed35641634df903757001a62d6baae0a8a8d9d0c46db27b78bd5a6edc8: Status 404 returned error can't find the container with id ea57f1ed35641634df903757001a62d6baae0a8a8d9d0c46db27b78bd5a6edc8
Apr 17 07:54:34.369385 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:34.369341 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw" event={"ID":"81317276-c9cb-47e7-a1e6-ff72bed6bfc7","Type":"ContainerStarted","Data":"ea57f1ed35641634df903757001a62d6baae0a8a8d9d0c46db27b78bd5a6edc8"}
Apr 17 07:54:35.373226 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:35.373140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw" event={"ID":"81317276-c9cb-47e7-a1e6-ff72bed6bfc7","Type":"ContainerStarted","Data":"a33d619f3faad5e454a778b9e6389b55ea4d79f1161046d775a9894c230865bd"}
Apr 17 07:54:35.392426 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:35.392385 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-plmjw" podStartSLOduration=33.020280458 podStartE2EDuration="34.392372828s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="2026-04-17 07:54:33.551669632 +0000 UTC m=+167.261273126" lastFinishedPulling="2026-04-17 07:54:34.923762001 +0000 UTC m=+168.633365496" observedRunningTime="2026-04-17 07:54:35.391580577 +0000 UTC m=+169.101184090" watchObservedRunningTime="2026-04-17 07:54:35.392372828 +0000 UTC m=+169.101976401"
Apr 17 07:54:36.879924 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:36.879683 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp"
Apr 17 07:54:38.525552 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.525513 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-nhk85"]
Apr 17 07:54:38.528665 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.528649 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.531429 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.531402 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 07:54:38.531429 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.531413 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-r296j\""
Apr 17 07:54:38.531624 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.531473 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 07:54:38.531624 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.531410 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 07:54:38.538810 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.538788 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-nhk85"]
Apr 17 07:54:38.584039 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.584009 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46kvg\" (UniqueName: \"kubernetes.io/projected/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-kube-api-access-46kvg\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.584039 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.584043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.584247 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.584060 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.584247 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.584139 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.684976 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.684933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46kvg\" (UniqueName: \"kubernetes.io/projected/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-kube-api-access-46kvg\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.685147 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.684983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.685147 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.685009 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.685147 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.685072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.685763 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.685741 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.687644 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.687613 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.687731 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.687705 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.692843 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.692818 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46kvg\" (UniqueName: \"kubernetes.io/projected/4e851222-e25b-4bd4-a0e0-f16d686a1ad2-kube-api-access-46kvg\") pod \"prometheus-operator-5676c8c784-nhk85\" (UID: \"4e851222-e25b-4bd4-a0e0-f16d686a1ad2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85"
Apr 17 07:54:38.838036 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.837929 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85" Apr 17 07:54:38.955092 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:38.955037 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-nhk85"] Apr 17 07:54:38.959293 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:38.959266 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e851222_e25b_4bd4_a0e0_f16d686a1ad2.slice/crio-02a8b7e96d492162e0cabcb9bbb8601eb750a9da86b46b3ab84fa3b1b819b374 WatchSource:0}: Error finding container 02a8b7e96d492162e0cabcb9bbb8601eb750a9da86b46b3ab84fa3b1b819b374: Status 404 returned error can't find the container with id 02a8b7e96d492162e0cabcb9bbb8601eb750a9da86b46b3ab84fa3b1b819b374 Apr 17 07:54:39.383464 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:39.383427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85" event={"ID":"4e851222-e25b-4bd4-a0e0-f16d686a1ad2","Type":"ContainerStarted","Data":"02a8b7e96d492162e0cabcb9bbb8601eb750a9da86b46b3ab84fa3b1b819b374"} Apr 17 07:54:40.355666 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:40.355628 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lrw8v" Apr 17 07:54:40.387506 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:40.387423 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85" event={"ID":"4e851222-e25b-4bd4-a0e0-f16d686a1ad2","Type":"ContainerStarted","Data":"903de6f1ac4fd12282be60117fe085ae0276424da7e1a88678516146dc961744"} Apr 17 07:54:40.387506 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:40.387466 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85" 
event={"ID":"4e851222-e25b-4bd4-a0e0-f16d686a1ad2","Type":"ContainerStarted","Data":"6b64d98fb3f586de40f21e175acf58219a6ae70563b155958ee89dda1b726e02"} Apr 17 07:54:40.404600 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:40.404550 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-nhk85" podStartSLOduration=1.230421236 podStartE2EDuration="2.404533397s" podCreationTimestamp="2026-04-17 07:54:38 +0000 UTC" firstStartedPulling="2026-04-17 07:54:38.961682881 +0000 UTC m=+172.671286372" lastFinishedPulling="2026-04-17 07:54:40.135795027 +0000 UTC m=+173.845398533" observedRunningTime="2026-04-17 07:54:40.404458436 +0000 UTC m=+174.114061951" watchObservedRunningTime="2026-04-17 07:54:40.404533397 +0000 UTC m=+174.114136911" Apr 17 07:54:40.764944 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:40.764849 2579 patch_prober.go:28] interesting pod/image-registry-558bd67658-kz8wz container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 07:54:40.765072 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:40.764918 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-558bd67658-kz8wz" podUID="d6a1cde0-f02d-40ca-91b3-342b21f46121" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 07:54:41.910733 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.910697 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj"] Apr 17 07:54:41.913952 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.913936 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:41.915995 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.915968 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 07:54:41.916305 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.916290 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 07:54:41.916577 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.916563 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8bbhf\"" Apr 17 07:54:41.926503 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.926483 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj"] Apr 17 07:54:41.929422 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.929403 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hlkqr"] Apr 17 07:54:41.932678 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.932654 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:41.934865 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.934842 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 07:54:41.935372 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.935354 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-68tcw\"" Apr 17 07:54:41.935555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.935383 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 07:54:41.935658 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:41.935426 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 07:54:42.010343 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010305 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/889bd38b-dd72-4df0-bede-0c93109664ba-sys\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.010343 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010344 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frt6z\" (UniqueName: \"kubernetes.io/projected/827d2ca3-e0b2-47a8-b2e2-124564260c25-kube-api-access-frt6z\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.010565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010373 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"root\" (UniqueName: \"kubernetes.io/host-path/889bd38b-dd72-4df0-bede-0c93109664ba-root\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.010565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010412 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-wtmp\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.010565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010502 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-accelerators-collector-config\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.010565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010534 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.010565 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010558 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/827d2ca3-e0b2-47a8-b2e2-124564260c25-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: 
\"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.011011 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-tls\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.011011 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010632 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/889bd38b-dd72-4df0-bede-0c93109664ba-metrics-client-ca\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.011011 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-textfile\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.011011 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010668 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/827d2ca3-e0b2-47a8-b2e2-124564260c25-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.011011 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010703 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/827d2ca3-e0b2-47a8-b2e2-124564260c25-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.011011 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.010763 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj4nj\" (UniqueName: \"kubernetes.io/projected/889bd38b-dd72-4df0-bede-0c93109664ba-kube-api-access-bj4nj\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111217 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111175 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/889bd38b-dd72-4df0-bede-0c93109664ba-sys\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111217 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111222 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frt6z\" (UniqueName: \"kubernetes.io/projected/827d2ca3-e0b2-47a8-b2e2-124564260c25-kube-api-access-frt6z\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.111449 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111246 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/889bd38b-dd72-4df0-bede-0c93109664ba-root\") pod \"node-exporter-hlkqr\" (UID: 
\"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111449 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111272 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-wtmp\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111449 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-accelerators-collector-config\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111449 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/889bd38b-dd72-4df0-bede-0c93109664ba-sys\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111449 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111329 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111449 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/827d2ca3-e0b2-47a8-b2e2-124564260c25-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.111449 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111386 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-tls\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111449 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111409 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/889bd38b-dd72-4df0-bede-0c93109664ba-metrics-client-ca\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111449 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-textfile\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111850 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/827d2ca3-e0b2-47a8-b2e2-124564260c25-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.111850 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:54:42.111467 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-wtmp\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111850 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/827d2ca3-e0b2-47a8-b2e2-124564260c25-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.111850 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111524 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/889bd38b-dd72-4df0-bede-0c93109664ba-root\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111850 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.111561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj4nj\" (UniqueName: \"kubernetes.io/projected/889bd38b-dd72-4df0-bede-0c93109664ba-kube-api-access-bj4nj\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.111850 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:42.111608 2579 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 07:54:42.112172 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.112108 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/889bd38b-dd72-4df0-bede-0c93109664ba-metrics-client-ca\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.112172 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.112114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-accelerators-collector-config\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.112172 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:42.112138 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/827d2ca3-e0b2-47a8-b2e2-124564260c25-openshift-state-metrics-tls podName:827d2ca3-e0b2-47a8-b2e2-124564260c25 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:42.612116817 +0000 UTC m=+176.321720311 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/827d2ca3-e0b2-47a8-b2e2-124564260c25-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-zgmwj" (UID: "827d2ca3-e0b2-47a8-b2e2-124564260c25") : secret "openshift-state-metrics-tls" not found Apr 17 07:54:42.112172 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.112150 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-textfile\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.113683 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.113651 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/827d2ca3-e0b2-47a8-b2e2-124564260c25-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.114530 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.114502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-tls\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.114611 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.114532 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/889bd38b-dd72-4df0-bede-0c93109664ba-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 
07:54:42.114648 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.114606 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/827d2ca3-e0b2-47a8-b2e2-124564260c25-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.123728 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.123696 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frt6z\" (UniqueName: \"kubernetes.io/projected/827d2ca3-e0b2-47a8-b2e2-124564260c25-kube-api-access-frt6z\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.123944 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.123926 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj4nj\" (UniqueName: \"kubernetes.io/projected/889bd38b-dd72-4df0-bede-0c93109664ba-kube-api-access-bj4nj\") pod \"node-exporter-hlkqr\" (UID: \"889bd38b-dd72-4df0-bede-0c93109664ba\") " pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.244087 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.244012 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hlkqr" Apr 17 07:54:42.251726 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:42.251700 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod889bd38b_dd72_4df0_bede_0c93109664ba.slice/crio-5f35f1fab65a83529461f38f8d91ff913f411265eb4cb76eb290d48920f73115 WatchSource:0}: Error finding container 5f35f1fab65a83529461f38f8d91ff913f411265eb4cb76eb290d48920f73115: Status 404 returned error can't find the container with id 5f35f1fab65a83529461f38f8d91ff913f411265eb4cb76eb290d48920f73115 Apr 17 07:54:42.395373 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.395335 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hlkqr" event={"ID":"889bd38b-dd72-4df0-bede-0c93109664ba","Type":"ContainerStarted","Data":"5f35f1fab65a83529461f38f8d91ff913f411265eb4cb76eb290d48920f73115"} Apr 17 07:54:42.615946 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.615848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/827d2ca3-e0b2-47a8-b2e2-124564260c25-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.618324 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.618304 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/827d2ca3-e0b2-47a8-b2e2-124564260c25-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zgmwj\" (UID: \"827d2ca3-e0b2-47a8-b2e2-124564260c25\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.825059 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.825028 2579 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" Apr 17 07:54:42.976867 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:42.976841 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj"] Apr 17 07:54:43.400088 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.400049 2579 generic.go:358] "Generic (PLEG): container finished" podID="889bd38b-dd72-4df0-bede-0c93109664ba" containerID="61ed5451ae491abff1adaf9e231f08af43c2864d60577ee230b85c455d19a34e" exitCode=0 Apr 17 07:54:43.400245 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.400145 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hlkqr" event={"ID":"889bd38b-dd72-4df0-bede-0c93109664ba","Type":"ContainerDied","Data":"61ed5451ae491abff1adaf9e231f08af43c2864d60577ee230b85c455d19a34e"} Apr 17 07:54:43.401814 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.401794 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" event={"ID":"827d2ca3-e0b2-47a8-b2e2-124564260c25","Type":"ContainerStarted","Data":"c231bb4d5c328ebf04a63047971151239243668ac1d03a636f9459cd283c63c7"} Apr 17 07:54:43.401968 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.401818 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" event={"ID":"827d2ca3-e0b2-47a8-b2e2-124564260c25","Type":"ContainerStarted","Data":"af61788c004ca8b778351343beb5422fd2fc33b86b1c0a77af610198c1ba64f7"} Apr 17 07:54:43.401968 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.401837 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" event={"ID":"827d2ca3-e0b2-47a8-b2e2-124564260c25","Type":"ContainerStarted","Data":"78559667cddbe317fa3bf7dbf2ef4610855201875badf139b35c78591a42aeb2"} Apr 17 
07:54:43.929904 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.929848 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-754887df57-tjggd"] Apr 17 07:54:43.934182 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.934155 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:43.936797 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.936767 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 07:54:43.936944 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.936830 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 07:54:43.936944 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.936921 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 07:54:43.936944 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.936931 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-tpkhx\"" Apr 17 07:54:43.937118 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.937018 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 07:54:43.937216 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.937197 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 07:54:43.937285 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:43.937228 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-9l40mlslmqn79\"" Apr 17 07:54:43.948964 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:54:43.948937 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-754887df57-tjggd"] Apr 17 07:54:44.028480 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.028436 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvsg5\" (UniqueName: \"kubernetes.io/projected/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-kube-api-access-rvsg5\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.028865 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.028488 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.028865 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.028515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-grpc-tls\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.028865 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.028547 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " 
pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.028865 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.028575 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-metrics-client-ca\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.028865 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.028623 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-tls\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.028865 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.028698 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.028865 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.028751 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.130035 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:54:44.129991 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvsg5\" (UniqueName: \"kubernetes.io/projected/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-kube-api-access-rvsg5\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.130035 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.130031 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.130269 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.130161 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-grpc-tls\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.130269 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.130204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.130269 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.130236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-metrics-client-ca\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.130269 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.130262 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-tls\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.130440 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.130285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.130440 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.130318 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.131390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.131334 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-metrics-client-ca\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " 
pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.133121 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.133089 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.133324 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.133257 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.133455 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.133414 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-grpc-tls\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.133609 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.133581 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.133692 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.133668 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.134431 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.134412 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-secret-thanos-querier-tls\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.138203 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.138179 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvsg5\" (UniqueName: \"kubernetes.io/projected/80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4-kube-api-access-rvsg5\") pod \"thanos-querier-754887df57-tjggd\" (UID: \"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4\") " pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.244760 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.244665 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:44.385221 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.385186 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-754887df57-tjggd"] Apr 17 07:54:44.391393 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:44.391364 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c29f9b_89fa_41d1_b4a2_4f1dfcf38af4.slice/crio-99f8ee3657891407d35cf0eefceea2d231e808c5b4203bccd98533e912526375 WatchSource:0}: Error finding container 99f8ee3657891407d35cf0eefceea2d231e808c5b4203bccd98533e912526375: Status 404 returned error can't find the container with id 99f8ee3657891407d35cf0eefceea2d231e808c5b4203bccd98533e912526375 Apr 17 07:54:44.405999 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.405965 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" event={"ID":"827d2ca3-e0b2-47a8-b2e2-124564260c25","Type":"ContainerStarted","Data":"f7f9574210e107dc66511fd51d19530614f7b25866eeeb9da947863799a43aca"} Apr 17 07:54:44.407142 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.407105 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" event={"ID":"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4","Type":"ContainerStarted","Data":"99f8ee3657891407d35cf0eefceea2d231e808c5b4203bccd98533e912526375"} Apr 17 07:54:44.409095 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.409075 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hlkqr" event={"ID":"889bd38b-dd72-4df0-bede-0c93109664ba","Type":"ContainerStarted","Data":"fb11e05ab3205eff756fcaac7b66031f103ce16edacfc0da9201c21a2c119a57"} Apr 17 07:54:44.409195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.409098 2579 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/node-exporter-hlkqr" event={"ID":"889bd38b-dd72-4df0-bede-0c93109664ba","Type":"ContainerStarted","Data":"0718e4272ec56009590106c2da1fe6e931b54f62e181c701a80c2ef51d628a9e"} Apr 17 07:54:44.439080 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.439030 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zgmwj" podStartSLOduration=2.582119564 podStartE2EDuration="3.439015778s" podCreationTimestamp="2026-04-17 07:54:41 +0000 UTC" firstStartedPulling="2026-04-17 07:54:43.207847695 +0000 UTC m=+176.917451185" lastFinishedPulling="2026-04-17 07:54:44.064743894 +0000 UTC m=+177.774347399" observedRunningTime="2026-04-17 07:54:44.437388724 +0000 UTC m=+178.146992239" watchObservedRunningTime="2026-04-17 07:54:44.439015778 +0000 UTC m=+178.148619291" Apr 17 07:54:44.459408 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:44.459360 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hlkqr" podStartSLOduration=2.607619048 podStartE2EDuration="3.459344061s" podCreationTimestamp="2026-04-17 07:54:41 +0000 UTC" firstStartedPulling="2026-04-17 07:54:42.258322801 +0000 UTC m=+175.967926307" lastFinishedPulling="2026-04-17 07:54:43.110047827 +0000 UTC m=+176.819651320" observedRunningTime="2026-04-17 07:54:44.458556199 +0000 UTC m=+178.168159737" watchObservedRunningTime="2026-04-17 07:54:44.459344061 +0000 UTC m=+178.168947574" Apr 17 07:54:45.877573 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:45.877537 2579 scope.go:117] "RemoveContainer" containerID="6c07e074e0003a1d17e5324a336b07c09b0f070df073b0613e4da14e6f212f2b" Apr 17 07:54:46.416940 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:46.416838 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 
07:54:46.417106 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:46.416960 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" event={"ID":"e3b4cab1-15d3-4640-89c6-ec91e734f2fd","Type":"ContainerStarted","Data":"f21655238d5aac017287885f6acb256fe458c9b72c5b72b2b0ff8e3685809f8a"} Apr 17 07:54:46.417300 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:46.417277 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" Apr 17 07:54:46.419085 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:46.419064 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" event={"ID":"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4","Type":"ContainerStarted","Data":"719bef6559b5d79295a7c74c0f624e4c3a5697117d38ae9f9d8c3b72d4f9b515"} Apr 17 07:54:46.419186 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:46.419090 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" event={"ID":"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4","Type":"ContainerStarted","Data":"6f994b69570b6a0c9e3b8ff0084b529cd6b02f5a3b0b68ec0728adf10a30af18"} Apr 17 07:54:46.419186 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:46.419101 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" event={"ID":"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4","Type":"ContainerStarted","Data":"ec82ed3262f1f62a31140e922dea4b3272e98be7a0098638feb8ff79648b18e7"} Apr 17 07:54:46.433808 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:46.433771 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" podStartSLOduration=43.809969539 podStartE2EDuration="45.433758243s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="2026-04-17 
07:54:01.761977364 +0000 UTC m=+135.471580855" lastFinishedPulling="2026-04-17 07:54:03.385766065 +0000 UTC m=+137.095369559" observedRunningTime="2026-04-17 07:54:46.43272348 +0000 UTC m=+180.142326993" watchObservedRunningTime="2026-04-17 07:54:46.433758243 +0000 UTC m=+180.143361778" Apr 17 07:54:47.129840 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.129811 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-579c77c998-2zhv2"] Apr 17 07:54:47.134211 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.134193 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.137619 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.137281 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 07:54:47.137619 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.137325 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 07:54:47.137773 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.137754 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-p6849\"" Apr 17 07:54:47.137844 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.137828 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 07:54:47.141417 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.138615 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 07:54:47.149949 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.149233 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 07:54:47.150334 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.150296 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-579c77c998-2zhv2"] Apr 17 07:54:47.152169 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.152151 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 07:54:47.159571 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.159548 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3383c607-6fe7-4824-a6bf-297348c8d409-telemeter-trusted-ca-bundle\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.159670 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.159591 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3383c607-6fe7-4824-a6bf-297348c8d409-metrics-client-ca\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.159670 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.159643 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-secret-telemeter-client\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.159786 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.159709 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-federate-client-tls\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.159786 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.159745 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-telemeter-client-tls\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.159930 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.159784 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3383c607-6fe7-4824-a6bf-297348c8d409-serving-certs-ca-bundle\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.159930 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.159876 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.160048 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.159941 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bq28s\" (UniqueName: \"kubernetes.io/projected/3383c607-6fe7-4824-a6bf-297348c8d409-kube-api-access-bq28s\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.261454 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.261321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3383c607-6fe7-4824-a6bf-297348c8d409-metrics-client-ca\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.261454 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.261387 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-secret-telemeter-client\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.261454 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.261436 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-federate-client-tls\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.261640 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.261465 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-telemeter-client-tls\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " 
pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.261640 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.261500 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3383c607-6fe7-4824-a6bf-297348c8d409-serving-certs-ca-bundle\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.261640 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.261570 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.261640 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.261601 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq28s\" (UniqueName: \"kubernetes.io/projected/3383c607-6fe7-4824-a6bf-297348c8d409-kube-api-access-bq28s\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.261833 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.261642 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3383c607-6fe7-4824-a6bf-297348c8d409-telemeter-trusted-ca-bundle\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.263235 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.262642 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3383c607-6fe7-4824-a6bf-297348c8d409-telemeter-trusted-ca-bundle\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.263235 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.262857 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3383c607-6fe7-4824-a6bf-297348c8d409-serving-certs-ca-bundle\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.263235 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.263121 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3383c607-6fe7-4824-a6bf-297348c8d409-metrics-client-ca\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.267152 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.267129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-telemeter-client-tls\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.267728 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.267701 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-secret-telemeter-client-kube-rbac-proxy-config\") pod 
\"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.268032 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.268009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-federate-client-tls\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.269004 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.268978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3383c607-6fe7-4824-a6bf-297348c8d409-secret-telemeter-client\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.270795 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.270771 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq28s\" (UniqueName: \"kubernetes.io/projected/3383c607-6fe7-4824-a6bf-297348c8d409-kube-api-access-bq28s\") pod \"telemeter-client-579c77c998-2zhv2\" (UID: \"3383c607-6fe7-4824-a6bf-297348c8d409\") " pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.417338 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.417296 2579 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-7wts4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 17 07:54:47.417484 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.417371 2579 prober.go:120] "Probe failed" 
probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" podUID="e3b4cab1-15d3-4640-89c6-ec91e734f2fd" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 17 07:54:47.425542 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.425515 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" event={"ID":"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4","Type":"ContainerStarted","Data":"52cd2fdba426c46ef6331149c7487dc556559b2c4cee66e52aa682e53ac5790e"} Apr 17 07:54:47.425696 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.425553 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" event={"ID":"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4","Type":"ContainerStarted","Data":"7d57a20ee5a8681adc9685fc78b76c38bc44a8ec58e7ba37557a1d5cbe50273d"} Apr 17 07:54:47.425696 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.425568 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" event={"ID":"80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4","Type":"ContainerStarted","Data":"1d8320a8210655b63a268833cebb7e9ea210c232677dd56f81e06b9e24cae1e7"} Apr 17 07:54:47.448942 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.448897 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" podStartSLOduration=1.761762548 podStartE2EDuration="4.448868532s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:44.393128724 +0000 UTC m=+178.102732215" lastFinishedPulling="2026-04-17 07:54:47.080234694 +0000 UTC m=+180.789838199" observedRunningTime="2026-04-17 07:54:47.447628972 +0000 UTC m=+181.157232496" watchObservedRunningTime="2026-04-17 
07:54:47.448868532 +0000 UTC m=+181.158472047" Apr 17 07:54:47.457054 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.457030 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" Apr 17 07:54:47.489318 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.489207 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-7wts4" Apr 17 07:54:47.602453 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.602330 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-579c77c998-2zhv2"] Apr 17 07:54:47.604931 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:47.604904 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3383c607_6fe7_4824_a6bf_297348c8d409.slice/crio-6ec8b4a09b15585056cb018cf6616a917104a5aaa45cc869d678bd4901bcef33 WatchSource:0}: Error finding container 6ec8b4a09b15585056cb018cf6616a917104a5aaa45cc869d678bd4901bcef33: Status 404 returned error can't find the container with id 6ec8b4a09b15585056cb018cf6616a917104a5aaa45cc869d678bd4901bcef33 Apr 17 07:54:47.689039 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.689009 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-6d49j"] Apr 17 07:54:47.693445 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.693425 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-6d49j" Apr 17 07:54:47.695668 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.695648 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 07:54:47.696089 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.696074 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 07:54:47.696563 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.696550 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-ljjbx\"" Apr 17 07:54:47.702690 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.702662 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-6d49j"] Apr 17 07:54:47.767611 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.767580 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q22sg\" (UniqueName: \"kubernetes.io/projected/7cd30c29-08d2-4de4-9362-0464cd619d23-kube-api-access-q22sg\") pod \"downloads-6bcc868b7-6d49j\" (UID: \"7cd30c29-08d2-4de4-9362-0464cd619d23\") " pod="openshift-console/downloads-6bcc868b7-6d49j" Apr 17 07:54:47.868567 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.868494 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q22sg\" (UniqueName: \"kubernetes.io/projected/7cd30c29-08d2-4de4-9362-0464cd619d23-kube-api-access-q22sg\") pod \"downloads-6bcc868b7-6d49j\" (UID: \"7cd30c29-08d2-4de4-9362-0464cd619d23\") " pod="openshift-console/downloads-6bcc868b7-6d49j" Apr 17 07:54:47.877408 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:47.877384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q22sg\" (UniqueName: 
\"kubernetes.io/projected/7cd30c29-08d2-4de4-9362-0464cd619d23-kube-api-access-q22sg\") pod \"downloads-6bcc868b7-6d49j\" (UID: \"7cd30c29-08d2-4de4-9362-0464cd619d23\") " pod="openshift-console/downloads-6bcc868b7-6d49j" Apr 17 07:54:48.004044 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.004004 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-6d49j" Apr 17 07:54:48.175994 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.175962 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-6d49j"] Apr 17 07:54:48.180139 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:48.180110 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cd30c29_08d2_4de4_9362_0464cd619d23.slice/crio-5847043f49a1a0d890547cfc9889f10c1f6819015e906eb6a6a1544f5d1add00 WatchSource:0}: Error finding container 5847043f49a1a0d890547cfc9889f10c1f6819015e906eb6a6a1544f5d1add00: Status 404 returned error can't find the container with id 5847043f49a1a0d890547cfc9889f10c1f6819015e906eb6a6a1544f5d1add00 Apr 17 07:54:48.275479 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.275437 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:54:48.282620 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.282592 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.285121 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.285092 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 07:54:48.285246 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.285203 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-9bxsp\"" Apr 17 07:54:48.285329 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.285273 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-builu79s5mq0v\"" Apr 17 07:54:48.285545 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.285526 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 07:54:48.285665 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.285648 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 07:54:48.285722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.285648 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 07:54:48.285722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.285710 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 07:54:48.285820 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.285722 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 07:54:48.285820 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.285658 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 07:54:48.286173 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.286156 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 07:54:48.286293 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.286273 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 07:54:48.288553 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.286816 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 07:54:48.290234 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.288801 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 07:54:48.293029 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.292323 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 07:54:48.298630 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.298587 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 07:54:48.299179 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.299159 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:54:48.375388 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375326 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375388 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:54:48.375382 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375693 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375409 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375693 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375455 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375693 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375495 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7557c\" (UniqueName: \"kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-kube-api-access-7557c\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375693 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375617 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375693 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config-out\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375751 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375810 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375836 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375874 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375922 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375941 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.375980 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.376257 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.375991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-grpc-tls\") pod \"prometheus-k8s-0\" 
(UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.376257 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.376025 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.376257 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.376095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.376257 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.376157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-web-config\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.430148 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.430068 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-6d49j" event={"ID":"7cd30c29-08d2-4de4-9362-0464cd619d23","Type":"ContainerStarted","Data":"5847043f49a1a0d890547cfc9889f10c1f6819015e906eb6a6a1544f5d1add00"} Apr 17 07:54:48.431266 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.431231 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" 
event={"ID":"3383c607-6fe7-4824-a6bf-297348c8d409","Type":"ContainerStarted","Data":"6ec8b4a09b15585056cb018cf6616a917104a5aaa45cc869d678bd4901bcef33"} Apr 17 07:54:48.432023 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.432001 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:48.477414 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477371 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.477414 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.477649 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.477649 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477623 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.477764 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477654 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.477764 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477682 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.477764 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.477931 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477836 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.477931 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.478046 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477938 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-web-config\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.478046 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.477993 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.478046 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.478036 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.478199 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.478060 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.478199 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.478107 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.478199 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.478149 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7557c\" (UniqueName: \"kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-kube-api-access-7557c\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.478344 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.478216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.478344 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.478246 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.478344 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.478279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config-out\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.478344 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.478311 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.479836 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.478679 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.479836 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.479527 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.479836 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.479621 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.481628 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.481034 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
07:54:48.481628 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.481352 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.483705 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.483676 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.484113 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.484088 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.484204 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.484174 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.484671 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.484651 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-web-config\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.484788 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.484731 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.484974 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.484953 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.485124 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.485106 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.485228 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.485206 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.485556 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.485526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config-out\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.485649 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:54:48.485612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.485748 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.485730 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.491954 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.491840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7557c\" (UniqueName: \"kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-kube-api-access-7557c\") pod \"prometheus-k8s-0\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.599221 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.599184 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:48.777736 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:48.777702 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:54:48.783683 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:54:48.783653 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0064fe16_d7e9_4eb8_9fd6_d9a5cbcd91ff.slice/crio-e81bd8c166f2d2a90f80fcb1b3caaf5bbe320dcb7dc208cd7d04678f4aa01519 WatchSource:0}: Error finding container e81bd8c166f2d2a90f80fcb1b3caaf5bbe320dcb7dc208cd7d04678f4aa01519: Status 404 returned error can't find the container with id e81bd8c166f2d2a90f80fcb1b3caaf5bbe320dcb7dc208cd7d04678f4aa01519 Apr 17 07:54:49.437050 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:49.437005 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerStarted","Data":"e81bd8c166f2d2a90f80fcb1b3caaf5bbe320dcb7dc208cd7d04678f4aa01519"} Apr 17 07:54:50.443073 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:50.442980 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" event={"ID":"3383c607-6fe7-4824-a6bf-297348c8d409","Type":"ContainerStarted","Data":"82cf184c06962c4438432e38b45ac8fb1f9c25152cd993fb60d8fc1639a1a376"} Apr 17 07:54:50.443073 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:50.443022 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" event={"ID":"3383c607-6fe7-4824-a6bf-297348c8d409","Type":"ContainerStarted","Data":"172f8bd2293b501a31e675ce5c8d3558e84d3521d038a10b593b5ab7e2f5c2ea"} Apr 17 07:54:50.443073 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:50.443035 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" event={"ID":"3383c607-6fe7-4824-a6bf-297348c8d409","Type":"ContainerStarted","Data":"1dcc96db173482039dc4d112eefe1b7b0d7886be5051b45f80e02ea93f631e8b"} Apr 17 07:54:50.444509 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:50.444480 2579 generic.go:358] "Generic (PLEG): container finished" podID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerID="e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a" exitCode=0 Apr 17 07:54:50.444634 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:50.444566 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerDied","Data":"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a"} Apr 17 07:54:50.463667 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:50.463600 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-579c77c998-2zhv2" podStartSLOduration=0.880573343 podStartE2EDuration="3.463587815s" podCreationTimestamp="2026-04-17 07:54:47 +0000 UTC" firstStartedPulling="2026-04-17 07:54:47.606720678 +0000 UTC m=+181.316324169" lastFinishedPulling="2026-04-17 07:54:50.189735148 +0000 UTC m=+183.899338641" observedRunningTime="2026-04-17 07:54:50.463048162 +0000 UTC m=+184.172651678" watchObservedRunningTime="2026-04-17 07:54:50.463587815 +0000 UTC m=+184.173191327" Apr 17 07:54:50.765054 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:50.764977 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-558bd67658-kz8wz" Apr 17 07:54:54.444194 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:54.444166 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-754887df57-tjggd" Apr 17 07:54:54.461054 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:54.461019 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerStarted","Data":"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316"} Apr 17 07:54:54.461054 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:54.461058 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerStarted","Data":"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0"} Apr 17 07:54:54.461291 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:54.461069 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerStarted","Data":"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e"} Apr 17 07:54:54.461291 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:54.461079 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerStarted","Data":"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee"} Apr 17 07:54:54.461291 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:54.461088 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerStarted","Data":"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268"} Apr 17 07:54:54.461291 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:54.461099 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerStarted","Data":"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab"} Apr 17 07:54:54.493620 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:54.493563 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.813327417 podStartE2EDuration="6.493541527s" podCreationTimestamp="2026-04-17 07:54:48 +0000 UTC" firstStartedPulling="2026-04-17 07:54:48.786434569 +0000 UTC m=+182.496038068" lastFinishedPulling="2026-04-17 07:54:53.466648684 +0000 UTC m=+187.176252178" observedRunningTime="2026-04-17 07:54:54.49283915 +0000 UTC m=+188.202442642" watchObservedRunningTime="2026-04-17 07:54:54.493541527 +0000 UTC m=+188.203145045" Apr 17 07:54:55.778661 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:55.778602 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-558bd67658-kz8wz" podUID="d6a1cde0-f02d-40ca-91b3-342b21f46121" containerName="registry" containerID="cri-o://dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb" gracePeriod=30 Apr 17 07:54:56.033275 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.033198 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-558bd67658-kz8wz" Apr 17 07:54:56.165228 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.165187 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-trusted-ca\") pod \"d6a1cde0-f02d-40ca-91b3-342b21f46121\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " Apr 17 07:54:56.165410 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.165301 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6a1cde0-f02d-40ca-91b3-342b21f46121-ca-trust-extracted\") pod \"d6a1cde0-f02d-40ca-91b3-342b21f46121\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " Apr 17 07:54:56.165410 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.165352 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls\") pod \"d6a1cde0-f02d-40ca-91b3-342b21f46121\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " Apr 17 07:54:56.165410 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.165391 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-bound-sa-token\") pod \"d6a1cde0-f02d-40ca-91b3-342b21f46121\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " Apr 17 07:54:56.165559 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.165426 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-certificates\") pod \"d6a1cde0-f02d-40ca-91b3-342b21f46121\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " Apr 17 07:54:56.165559 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:54:56.165456 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-installation-pull-secrets\") pod \"d6a1cde0-f02d-40ca-91b3-342b21f46121\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " Apr 17 07:54:56.165559 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.165488 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mxv6\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-kube-api-access-9mxv6\") pod \"d6a1cde0-f02d-40ca-91b3-342b21f46121\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " Apr 17 07:54:56.165559 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.165522 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-image-registry-private-configuration\") pod \"d6a1cde0-f02d-40ca-91b3-342b21f46121\" (UID: \"d6a1cde0-f02d-40ca-91b3-342b21f46121\") " Apr 17 07:54:56.165743 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.165688 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d6a1cde0-f02d-40ca-91b3-342b21f46121" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:56.165815 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.165796 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-trusted-ca\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:54:56.165932 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.165873 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d6a1cde0-f02d-40ca-91b3-342b21f46121" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:56.168437 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.168392 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d6a1cde0-f02d-40ca-91b3-342b21f46121" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:54:56.168613 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.168586 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-kube-api-access-9mxv6" (OuterVolumeSpecName: "kube-api-access-9mxv6") pod "d6a1cde0-f02d-40ca-91b3-342b21f46121" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121"). InnerVolumeSpecName "kube-api-access-9mxv6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:54:56.168856 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.168823 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d6a1cde0-f02d-40ca-91b3-342b21f46121" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:54:56.168970 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.168846 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d6a1cde0-f02d-40ca-91b3-342b21f46121" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:54:56.168970 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.168820 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d6a1cde0-f02d-40ca-91b3-342b21f46121" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:54:56.177694 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.177670 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a1cde0-f02d-40ca-91b3-342b21f46121-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d6a1cde0-f02d-40ca-91b3-342b21f46121" (UID: "d6a1cde0-f02d-40ca-91b3-342b21f46121"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:54:56.266595 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.266560 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6a1cde0-f02d-40ca-91b3-342b21f46121-ca-trust-extracted\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:54:56.266595 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.266590 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-tls\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:54:56.266595 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.266600 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-bound-sa-token\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:54:56.266818 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.266609 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6a1cde0-f02d-40ca-91b3-342b21f46121-registry-certificates\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:54:56.266818 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.266619 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-installation-pull-secrets\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:54:56.266818 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.266632 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9mxv6\" (UniqueName: \"kubernetes.io/projected/d6a1cde0-f02d-40ca-91b3-342b21f46121-kube-api-access-9mxv6\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:54:56.266818 
ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.266647 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d6a1cde0-f02d-40ca-91b3-342b21f46121-image-registry-private-configuration\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:54:56.470322 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.470282 2579 generic.go:358] "Generic (PLEG): container finished" podID="d6a1cde0-f02d-40ca-91b3-342b21f46121" containerID="dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb" exitCode=0 Apr 17 07:54:56.470521 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.470332 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-558bd67658-kz8wz" event={"ID":"d6a1cde0-f02d-40ca-91b3-342b21f46121","Type":"ContainerDied","Data":"dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb"} Apr 17 07:54:56.470521 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.470355 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-558bd67658-kz8wz" Apr 17 07:54:56.470521 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.470365 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-558bd67658-kz8wz" event={"ID":"d6a1cde0-f02d-40ca-91b3-342b21f46121","Type":"ContainerDied","Data":"168490dc687a01c78476d3a7134ccf4c69515a09a1549511d1d44ebb1c2db46c"} Apr 17 07:54:56.470521 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.470385 2579 scope.go:117] "RemoveContainer" containerID="dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb" Apr 17 07:54:56.480298 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.480266 2579 scope.go:117] "RemoveContainer" containerID="dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb" Apr 17 07:54:56.480589 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:54:56.480557 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb\": container with ID starting with dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb not found: ID does not exist" containerID="dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb" Apr 17 07:54:56.480694 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.480601 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb"} err="failed to get container status \"dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb\": rpc error: code = NotFound desc = could not find container \"dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb\": container with ID starting with dffcebf3cb3cedbe3704ab548e8e35b077bf42d9b1372e09359ef19b3ef918cb not found: ID does not exist" Apr 17 07:54:56.492917 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:54:56.492878 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-558bd67658-kz8wz"] Apr 17 07:54:56.496493 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.496448 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-558bd67658-kz8wz"] Apr 17 07:54:56.882372 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:56.882339 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a1cde0-f02d-40ca-91b3-342b21f46121" path="/var/lib/kubelet/pods/d6a1cde0-f02d-40ca-91b3-342b21f46121/volumes" Apr 17 07:54:58.600074 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:54:58.600042 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:05.501395 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:05.501360 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-6d49j" event={"ID":"7cd30c29-08d2-4de4-9362-0464cd619d23","Type":"ContainerStarted","Data":"9cd2329a3a03c81100f269f8201f69e4be568965da901d06dc16c0b558086927"} Apr 17 07:55:05.501866 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:05.501572 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-6d49j" Apr 17 07:55:05.517096 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:05.517067 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-6d49j" Apr 17 07:55:05.521239 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:05.521187 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-6d49j" podStartSLOduration=2.000434541 podStartE2EDuration="18.521170561s" podCreationTimestamp="2026-04-17 07:54:47 +0000 UTC" firstStartedPulling="2026-04-17 07:54:48.182663803 +0000 UTC m=+181.892267297" lastFinishedPulling="2026-04-17 
07:55:04.703399809 +0000 UTC m=+198.413003317" observedRunningTime="2026-04-17 07:55:05.518713529 +0000 UTC m=+199.228317046" watchObservedRunningTime="2026-04-17 07:55:05.521170561 +0000 UTC m=+199.230774075" Apr 17 07:55:26.607182 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:26.607145 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lrw8v_61a4efa4-9ac5-47c3-ba8a-6fa191936c56/dns/0.log" Apr 17 07:55:26.612854 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:26.612826 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lrw8v_61a4efa4-9ac5-47c3-ba8a-6fa191936c56/kube-rbac-proxy/0.log" Apr 17 07:55:26.787619 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:26.787590 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-62tj8_e5343931-acc9-4a96-81bd-fc6bbad4d9be/dns-node-resolver/0.log" Apr 17 07:55:48.599509 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:48.599474 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:48.667157 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:48.667117 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:48.682463 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:48.682443 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:58.590036 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:58.589983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:55:58.592401 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:55:58.592382 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b638776-5000-47ee-92c9-8ba7655f560c-metrics-certs\") pod \"network-metrics-daemon-qq9zp\" (UID: \"2b638776-5000-47ee-92c9-8ba7655f560c\") " pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:55:58.782669 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:58.782639 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqz2q\"" Apr 17 07:55:58.791318 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:58.791282 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qq9zp" Apr 17 07:55:58.910927 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:58.910896 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qq9zp"] Apr 17 07:55:58.914300 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:55:58.914272 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b638776_5000_47ee_92c9_8ba7655f560c.slice/crio-12ee4ba8a231015cb9ccd33ddd384424699dbb7a1ceeacd159ad7fcea0231b45 WatchSource:0}: Error finding container 12ee4ba8a231015cb9ccd33ddd384424699dbb7a1ceeacd159ad7fcea0231b45: Status 404 returned error can't find the container with id 12ee4ba8a231015cb9ccd33ddd384424699dbb7a1ceeacd159ad7fcea0231b45 Apr 17 07:55:59.668652 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:55:59.668612 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qq9zp" event={"ID":"2b638776-5000-47ee-92c9-8ba7655f560c","Type":"ContainerStarted","Data":"12ee4ba8a231015cb9ccd33ddd384424699dbb7a1ceeacd159ad7fcea0231b45"} Apr 17 07:56:00.673843 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:00.673806 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-qq9zp" event={"ID":"2b638776-5000-47ee-92c9-8ba7655f560c","Type":"ContainerStarted","Data":"31ad8bc933eec7c3cf2b1614f8743e204daeda88cd579437ed916fc175017a14"} Apr 17 07:56:00.673843 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:00.673844 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qq9zp" event={"ID":"2b638776-5000-47ee-92c9-8ba7655f560c","Type":"ContainerStarted","Data":"e13a7653242d03e20156f5dc807bf32b3c4d4846a6d6494f212d3d09c6283e61"} Apr 17 07:56:00.700106 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:00.700049 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qq9zp" podStartSLOduration=253.703995277 podStartE2EDuration="4m14.700032621s" podCreationTimestamp="2026-04-17 07:51:46 +0000 UTC" firstStartedPulling="2026-04-17 07:55:58.916105254 +0000 UTC m=+252.625708747" lastFinishedPulling="2026-04-17 07:55:59.912142585 +0000 UTC m=+253.621746091" observedRunningTime="2026-04-17 07:56:00.698973024 +0000 UTC m=+254.408576560" watchObservedRunningTime="2026-04-17 07:56:00.700032621 +0000 UTC m=+254.409636137" Apr 17 07:56:06.609009 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.608974 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:56:06.609600 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.609563 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="prometheus" containerID="cri-o://b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab" gracePeriod=600 Apr 17 07:56:06.609680 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.609602 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" 
containerName="kube-rbac-proxy" containerID="cri-o://0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0" gracePeriod=600 Apr 17 07:56:06.609733 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.609645 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="thanos-sidecar" containerID="cri-o://cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee" gracePeriod=600 Apr 17 07:56:06.609733 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.609697 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy-web" containerID="cri-o://e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e" gracePeriod=600 Apr 17 07:56:06.609824 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.609727 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy-thanos" containerID="cri-o://3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316" gracePeriod=600 Apr 17 07:56:06.609824 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.609649 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="config-reloader" containerID="cri-o://c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268" gracePeriod=600 Apr 17 07:56:06.871995 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.871968 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:06.956874 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.956843 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-tls\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.956874 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.956896 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-web-config\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957125 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.956924 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-trusted-ca-bundle\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957125 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.956941 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-metrics-client-certs\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957125 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.956969 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-tls-assets\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 
07:56:06.957125 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.956992 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-grpc-tls\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957125 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957018 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7557c\" (UniqueName: \"kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-kube-api-access-7557c\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957125 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957060 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-kubelet-serving-ca-bundle\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957448 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957153 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957448 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957182 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-db\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957448 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957222 2579 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config-out\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957448 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957252 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-thanos-prometheus-http-client-file\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957448 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957285 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957448 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957311 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-rulefiles-0\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957448 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957345 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957448 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:56:06.957388 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-serving-certs-ca-bundle\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.957448 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957436 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-kube-rbac-proxy\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.958087 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957489 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-metrics-client-ca\") pod \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\" (UID: \"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff\") " Apr 17 07:56:06.958087 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957609 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:06.958087 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957633 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:06.958087 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.957961 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:06.958286 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.958242 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:06.958341 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.958282 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:06.958341 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.958308 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-trusted-ca-bundle\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:06.958341 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.958325 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:06.960216 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.960184 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config" (OuterVolumeSpecName: "config") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:06.960402 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.960216 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:56:06.960483 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.960409 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:06.961505 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.961415 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:06.961965 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.961930 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:06.961965 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.961931 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:06.962240 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.962212 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:06.962240 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.962229 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:06.962420 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.962395 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config-out" (OuterVolumeSpecName: "config-out") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:56:06.962648 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.962622 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-kube-api-access-7557c" (OuterVolumeSpecName: "kube-api-access-7557c") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "kube-api-access-7557c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:06.962794 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.962769 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:06.963194 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.963174 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:06.963344 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.963325 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:06.972288 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:06.972266 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-web-config" (OuterVolumeSpecName: "web-config") pod "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" (UID: "0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:07.059517 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059484 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config-out\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059517 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059511 2579 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-thanos-prometheus-http-client-file\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059517 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059522 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059533 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059542 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059553 2579 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-kube-rbac-proxy\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059561 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-configmap-metrics-client-ca\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059570 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-prometheus-k8s-tls\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059579 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-web-config\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059587 2579 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-metrics-client-certs\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059596 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-tls-assets\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059605 2579 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-secret-grpc-tls\") on 
node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059613 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7557c\" (UniqueName: \"kubernetes.io/projected/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-kube-api-access-7557c\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059621 2579 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-config\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.059722 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.059629 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff-prometheus-k8s-db\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 07:56:07.696966 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.696933 2579 generic.go:358] "Generic (PLEG): container finished" podID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerID="3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316" exitCode=0 Apr 17 07:56:07.696966 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.696957 2579 generic.go:358] "Generic (PLEG): container finished" podID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerID="0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0" exitCode=0 Apr 17 07:56:07.696966 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.696963 2579 generic.go:358] "Generic (PLEG): container finished" podID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerID="e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e" exitCode=0 Apr 17 07:56:07.696966 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.696968 2579 generic.go:358] "Generic (PLEG): container finished" podID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" 
containerID="cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee" exitCode=0 Apr 17 07:56:07.696966 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.696973 2579 generic.go:358] "Generic (PLEG): container finished" podID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerID="c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268" exitCode=0 Apr 17 07:56:07.697462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.696978 2579 generic.go:358] "Generic (PLEG): container finished" podID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerID="b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab" exitCode=0 Apr 17 07:56:07.697462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.697015 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerDied","Data":"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316"} Apr 17 07:56:07.697462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.697039 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.697462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.697056 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerDied","Data":"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0"} Apr 17 07:56:07.697462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.697078 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerDied","Data":"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e"} Apr 17 07:56:07.697462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.697087 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerDied","Data":"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee"} Apr 17 07:56:07.697462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.697097 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerDied","Data":"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268"} Apr 17 07:56:07.697462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.697107 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerDied","Data":"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab"} Apr 17 07:56:07.697462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.697117 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff","Type":"ContainerDied","Data":"e81bd8c166f2d2a90f80fcb1b3caaf5bbe320dcb7dc208cd7d04678f4aa01519"} Apr 17 07:56:07.697462 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.697119 2579 scope.go:117] "RemoveContainer" containerID="3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316" Apr 17 07:56:07.705115 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.705101 2579 scope.go:117] "RemoveContainer" containerID="0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0" Apr 17 07:56:07.714259 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.714235 2579 scope.go:117] "RemoveContainer" containerID="e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e" Apr 17 07:56:07.720157 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.720135 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:56:07.721175 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.721159 2579 scope.go:117] "RemoveContainer" containerID="cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee" Apr 17 07:56:07.725183 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.725160 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:56:07.728573 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.728557 2579 scope.go:117] "RemoveContainer" containerID="c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268" Apr 17 07:56:07.734848 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.734833 2579 scope.go:117] "RemoveContainer" containerID="b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab" Apr 17 07:56:07.741539 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.741522 2579 scope.go:117] "RemoveContainer" containerID="e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a" Apr 17 07:56:07.747701 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.747685 2579 scope.go:117] 
"RemoveContainer" containerID="3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316" Apr 17 07:56:07.747945 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:56:07.747928 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": container with ID starting with 3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316 not found: ID does not exist" containerID="3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316" Apr 17 07:56:07.747999 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.747953 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316"} err="failed to get container status \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": rpc error: code = NotFound desc = could not find container \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": container with ID starting with 3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316 not found: ID does not exist" Apr 17 07:56:07.747999 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.747974 2579 scope.go:117] "RemoveContainer" containerID="0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0" Apr 17 07:56:07.748159 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:56:07.748141 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": container with ID starting with 0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0 not found: ID does not exist" containerID="0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0" Apr 17 07:56:07.748220 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.748168 2579 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0"} err="failed to get container status \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": rpc error: code = NotFound desc = could not find container \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": container with ID starting with 0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0 not found: ID does not exist" Apr 17 07:56:07.748220 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.748191 2579 scope.go:117] "RemoveContainer" containerID="e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e" Apr 17 07:56:07.748427 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:56:07.748412 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": container with ID starting with e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e not found: ID does not exist" containerID="e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e" Apr 17 07:56:07.748466 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.748432 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e"} err="failed to get container status \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": rpc error: code = NotFound desc = could not find container \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": container with ID starting with e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e not found: ID does not exist" Apr 17 07:56:07.748466 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.748445 2579 scope.go:117] "RemoveContainer" containerID="cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee" Apr 17 
07:56:07.748630 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:56:07.748617 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": container with ID starting with cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee not found: ID does not exist" containerID="cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee" Apr 17 07:56:07.748664 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.748634 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee"} err="failed to get container status \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": rpc error: code = NotFound desc = could not find container \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": container with ID starting with cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee not found: ID does not exist" Apr 17 07:56:07.748664 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.748645 2579 scope.go:117] "RemoveContainer" containerID="c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268" Apr 17 07:56:07.748836 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:56:07.748819 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": container with ID starting with c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268 not found: ID does not exist" containerID="c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268" Apr 17 07:56:07.748919 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.748842 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268"} err="failed to get container status \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": rpc error: code = NotFound desc = could not find container \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": container with ID starting with c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268 not found: ID does not exist" Apr 17 07:56:07.748919 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.748861 2579 scope.go:117] "RemoveContainer" containerID="b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab" Apr 17 07:56:07.749128 ip-10-0-142-45 kubenswrapper[2579]: E0417 07:56:07.749111 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": container with ID starting with b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab not found: ID does not exist" containerID="b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab" Apr 17 07:56:07.749173 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.749142 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab"} err="failed to get container status \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": rpc error: code = NotFound desc = could not find container \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": container with ID starting with b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab not found: ID does not exist" Apr 17 07:56:07.749173 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.749157 2579 scope.go:117] "RemoveContainer" containerID="e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a" Apr 17 07:56:07.749395 ip-10-0-142-45 
kubenswrapper[2579]: E0417 07:56:07.749379 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": container with ID starting with e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a not found: ID does not exist" containerID="e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a" Apr 17 07:56:07.749437 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.749398 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a"} err="failed to get container status \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": rpc error: code = NotFound desc = could not find container \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": container with ID starting with e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a not found: ID does not exist" Apr 17 07:56:07.749437 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.749412 2579 scope.go:117] "RemoveContainer" containerID="3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316" Apr 17 07:56:07.749607 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.749589 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316"} err="failed to get container status \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": rpc error: code = NotFound desc = could not find container \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": container with ID starting with 3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316 not found: ID does not exist" Apr 17 07:56:07.749676 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.749609 2579 scope.go:117] "RemoveContainer" 
containerID="0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0" Apr 17 07:56:07.749836 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.749818 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0"} err="failed to get container status \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": rpc error: code = NotFound desc = could not find container \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": container with ID starting with 0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0 not found: ID does not exist" Apr 17 07:56:07.749955 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.749837 2579 scope.go:117] "RemoveContainer" containerID="e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e" Apr 17 07:56:07.750032 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.750014 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e"} err="failed to get container status \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": rpc error: code = NotFound desc = could not find container \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": container with ID starting with e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e not found: ID does not exist" Apr 17 07:56:07.750083 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.750034 2579 scope.go:117] "RemoveContainer" containerID="cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee" Apr 17 07:56:07.750262 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.750242 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee"} err="failed to get container status 
\"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": rpc error: code = NotFound desc = could not find container \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": container with ID starting with cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee not found: ID does not exist" Apr 17 07:56:07.750262 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.750261 2579 scope.go:117] "RemoveContainer" containerID="c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268" Apr 17 07:56:07.750486 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.750467 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268"} err="failed to get container status \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": rpc error: code = NotFound desc = could not find container \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": container with ID starting with c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268 not found: ID does not exist" Apr 17 07:56:07.750532 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.750489 2579 scope.go:117] "RemoveContainer" containerID="b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab" Apr 17 07:56:07.750716 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.750701 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab"} err="failed to get container status \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": rpc error: code = NotFound desc = could not find container \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": container with ID starting with b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab not found: ID does not exist" Apr 17 07:56:07.750716 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:56:07.750716 2579 scope.go:117] "RemoveContainer" containerID="e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a" Apr 17 07:56:07.751022 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751004 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a"} err="failed to get container status \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": rpc error: code = NotFound desc = could not find container \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": container with ID starting with e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a not found: ID does not exist" Apr 17 07:56:07.751074 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751023 2579 scope.go:117] "RemoveContainer" containerID="3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316" Apr 17 07:56:07.751225 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751208 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316"} err="failed to get container status \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": rpc error: code = NotFound desc = could not find container \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": container with ID starting with 3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316 not found: ID does not exist" Apr 17 07:56:07.751265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751226 2579 scope.go:117] "RemoveContainer" containerID="0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0" Apr 17 07:56:07.751426 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751411 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0"} err="failed to get container status \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": rpc error: code = NotFound desc = could not find container \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": container with ID starting with 0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0 not found: ID does not exist" Apr 17 07:56:07.751476 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751427 2579 scope.go:117] "RemoveContainer" containerID="e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e" Apr 17 07:56:07.751609 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751593 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e"} err="failed to get container status \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": rpc error: code = NotFound desc = could not find container \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": container with ID starting with e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e not found: ID does not exist" Apr 17 07:56:07.751653 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751609 2579 scope.go:117] "RemoveContainer" containerID="cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee" Apr 17 07:56:07.751770 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751754 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee"} err="failed to get container status \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": rpc error: code = NotFound desc = could not find container \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": container with ID starting with 
cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee not found: ID does not exist" Apr 17 07:56:07.751877 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751769 2579 scope.go:117] "RemoveContainer" containerID="c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268" Apr 17 07:56:07.752009 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.751993 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268"} err="failed to get container status \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": rpc error: code = NotFound desc = could not find container \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": container with ID starting with c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268 not found: ID does not exist" Apr 17 07:56:07.752091 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.752010 2579 scope.go:117] "RemoveContainer" containerID="b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab" Apr 17 07:56:07.752253 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.752237 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab"} err="failed to get container status \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": rpc error: code = NotFound desc = could not find container \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": container with ID starting with b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab not found: ID does not exist" Apr 17 07:56:07.752253 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.752252 2579 scope.go:117] "RemoveContainer" containerID="e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a" Apr 17 07:56:07.752497 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.752470 2579 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a"} err="failed to get container status \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": rpc error: code = NotFound desc = could not find container \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": container with ID starting with e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a not found: ID does not exist" Apr 17 07:56:07.752497 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.752497 2579 scope.go:117] "RemoveContainer" containerID="3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316" Apr 17 07:56:07.752943 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.752914 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316"} err="failed to get container status \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": rpc error: code = NotFound desc = could not find container \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": container with ID starting with 3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316 not found: ID does not exist" Apr 17 07:56:07.753032 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.752945 2579 scope.go:117] "RemoveContainer" containerID="0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0" Apr 17 07:56:07.753284 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.753257 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0"} err="failed to get container status \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": rpc error: code = NotFound desc = could not find container 
\"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": container with ID starting with 0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0 not found: ID does not exist" Apr 17 07:56:07.753355 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.753290 2579 scope.go:117] "RemoveContainer" containerID="e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e" Apr 17 07:56:07.753578 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.753555 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e"} err="failed to get container status \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": rpc error: code = NotFound desc = could not find container \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": container with ID starting with e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e not found: ID does not exist" Apr 17 07:56:07.753659 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.753581 2579 scope.go:117] "RemoveContainer" containerID="cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee" Apr 17 07:56:07.753903 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.753855 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee"} err="failed to get container status \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": rpc error: code = NotFound desc = could not find container \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": container with ID starting with cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee not found: ID does not exist" Apr 17 07:56:07.753993 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.753904 2579 scope.go:117] "RemoveContainer" 
containerID="c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268" Apr 17 07:56:07.754119 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754099 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:56:07.754261 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754208 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268"} err="failed to get container status \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": rpc error: code = NotFound desc = could not find container \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": container with ID starting with c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268 not found: ID does not exist" Apr 17 07:56:07.754261 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754226 2579 scope.go:117] "RemoveContainer" containerID="b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab" Apr 17 07:56:07.754423 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754405 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy" Apr 17 07:56:07.754423 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754423 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754433 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="prometheus" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754439 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="prometheus" Apr 17 07:56:07.754555 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:56:07.754446 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy-web" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754453 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy-web" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754473 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6a1cde0-f02d-40ca-91b3-342b21f46121" containerName="registry" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754478 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a1cde0-f02d-40ca-91b3-342b21f46121" containerName="registry" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754486 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="thanos-sidecar" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754490 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="thanos-sidecar" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754487 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab"} err="failed to get container status \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": rpc error: code = NotFound desc = could not find container \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": container with ID starting with b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab not found: ID does not exist" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754497 2579 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="config-reloader" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754502 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="config-reloader" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754502 2579 scope.go:117] "RemoveContainer" containerID="e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754509 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="init-config-reloader" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754514 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="init-config-reloader" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754526 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy-thanos" Apr 17 07:56:07.754555 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754533 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy-thanos" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754596 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy-thanos" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754604 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy-web" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 
07:56:07.754614 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="config-reloader" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754624 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6a1cde0-f02d-40ca-91b3-342b21f46121" containerName="registry" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754637 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="thanos-sidecar" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754647 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="prometheus" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754657 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" containerName="kube-rbac-proxy" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754746 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a"} err="failed to get container status \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": rpc error: code = NotFound desc = could not find container \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": container with ID starting with e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a not found: ID does not exist" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.754767 2579 scope.go:117] "RemoveContainer" containerID="3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.755039 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316"} err="failed to get container status \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": rpc error: code = NotFound desc = could not find container \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": container with ID starting with 3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316 not found: ID does not exist" Apr 17 07:56:07.755195 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.755062 2579 scope.go:117] "RemoveContainer" containerID="0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0" Apr 17 07:56:07.755581 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.755304 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0"} err="failed to get container status \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": rpc error: code = NotFound desc = could not find container \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": container with ID starting with 0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0 not found: ID does not exist" Apr 17 07:56:07.755581 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.755324 2579 scope.go:117] "RemoveContainer" containerID="e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e" Apr 17 07:56:07.755581 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.755547 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e"} err="failed to get container status \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": rpc error: code = NotFound desc = could not find container \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": container with ID starting with 
e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e not found: ID does not exist" Apr 17 07:56:07.755581 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.755564 2579 scope.go:117] "RemoveContainer" containerID="cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee" Apr 17 07:56:07.755780 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.755761 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee"} err="failed to get container status \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": rpc error: code = NotFound desc = could not find container \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": container with ID starting with cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee not found: ID does not exist" Apr 17 07:56:07.755840 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.755783 2579 scope.go:117] "RemoveContainer" containerID="c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268" Apr 17 07:56:07.756069 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.756042 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268"} err="failed to get container status \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": rpc error: code = NotFound desc = could not find container \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": container with ID starting with c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268 not found: ID does not exist" Apr 17 07:56:07.756069 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.756068 2579 scope.go:117] "RemoveContainer" containerID="b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab" Apr 17 07:56:07.756302 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.756286 2579 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab"} err="failed to get container status \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": rpc error: code = NotFound desc = could not find container \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": container with ID starting with b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab not found: ID does not exist" Apr 17 07:56:07.756345 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.756304 2579 scope.go:117] "RemoveContainer" containerID="e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a" Apr 17 07:56:07.756501 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.756484 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a"} err="failed to get container status \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": rpc error: code = NotFound desc = could not find container \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": container with ID starting with e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a not found: ID does not exist" Apr 17 07:56:07.756559 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.756502 2579 scope.go:117] "RemoveContainer" containerID="3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316" Apr 17 07:56:07.756688 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.756668 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316"} err="failed to get container status \"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": rpc error: code = NotFound desc = could not find container 
\"3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316\": container with ID starting with 3abafaccbd73b02375e08753636eec57394df7b4b973481f5869ebdcc2668316 not found: ID does not exist" Apr 17 07:56:07.756759 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.756693 2579 scope.go:117] "RemoveContainer" containerID="0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0" Apr 17 07:56:07.756949 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.756875 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0"} err="failed to get container status \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": rpc error: code = NotFound desc = could not find container \"0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0\": container with ID starting with 0d1109707a91adb8d74fe01f45f9eb796f121c8029f2829ed6e919955585bda0 not found: ID does not exist" Apr 17 07:56:07.756949 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.756914 2579 scope.go:117] "RemoveContainer" containerID="e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e" Apr 17 07:56:07.757154 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.757091 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e"} err="failed to get container status \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": rpc error: code = NotFound desc = could not find container \"e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e\": container with ID starting with e69c8af11ea9f3b0ece1748e57301d46b667424630f46142af5a5d367c96149e not found: ID does not exist" Apr 17 07:56:07.757154 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.757115 2579 scope.go:117] "RemoveContainer" 
containerID="cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee" Apr 17 07:56:07.757675 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.757650 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee"} err="failed to get container status \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": rpc error: code = NotFound desc = could not find container \"cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee\": container with ID starting with cd07b4711af80c304fdb07aec7b451f3be00c5097f11160eec03c556939aa2ee not found: ID does not exist" Apr 17 07:56:07.757675 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.757675 2579 scope.go:117] "RemoveContainer" containerID="c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268" Apr 17 07:56:07.758392 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.758372 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268"} err="failed to get container status \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": rpc error: code = NotFound desc = could not find container \"c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268\": container with ID starting with c80fa84251d6541a4fa725757aca58a9e1a9954a90772d2a882a83e3bae4c268 not found: ID does not exist" Apr 17 07:56:07.758542 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.758530 2579 scope.go:117] "RemoveContainer" containerID="b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab" Apr 17 07:56:07.758989 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.758912 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab"} err="failed to get container status 
\"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": rpc error: code = NotFound desc = could not find container \"b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab\": container with ID starting with b44cd25e41f568e4d07d1035193505ddaa3110729be77d24db0072f0acc79bab not found: ID does not exist" Apr 17 07:56:07.758989 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.758935 2579 scope.go:117] "RemoveContainer" containerID="e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a" Apr 17 07:56:07.759439 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.759412 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a"} err="failed to get container status \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": rpc error: code = NotFound desc = could not find container \"e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a\": container with ID starting with e8689110048567e41452a7fad7b41207dd1d103482b1e935531379249893fb0a not found: ID does not exist" Apr 17 07:56:07.761463 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.761445 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.764076 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.764052 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 07:56:07.764076 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.764067 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-9bxsp\"" Apr 17 07:56:07.764368 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.764067 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 07:56:07.764368 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.764013 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 07:56:07.764551 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.764531 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 07:56:07.764626 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.764567 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 07:56:07.764626 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.764584 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 07:56:07.764718 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.764650 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 07:56:07.764829 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.764808 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 07:56:07.764937 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.764854 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 07:56:07.765018 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.765001 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 07:56:07.765162 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.765144 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-builu79s5mq0v\"" Apr 17 07:56:07.765162 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.765156 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 07:56:07.769212 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.769113 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 07:56:07.772837 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.772544 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 07:56:07.775999 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.775980 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:56:07.865687 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865657 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.865687 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865700 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.865957 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865720 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.865957 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865775 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.865957 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8gx6\" (UniqueName: \"kubernetes.io/projected/681625ef-d4f2-4499-83c8-2234ec8c6c18-kube-api-access-x8gx6\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.865957 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865818 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/681625ef-d4f2-4499-83c8-2234ec8c6c18-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.865957 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865835 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.865957 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.865957 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865875 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/681625ef-d4f2-4499-83c8-2234ec8c6c18-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.865957 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865922 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.865957 ip-10-0-142-45 
kubenswrapper[2579]: I0417 07:56:07.865948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.866265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865967 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.866265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865981 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.866265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.865998 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.866265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.866030 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/681625ef-d4f2-4499-83c8-2234ec8c6c18-config-out\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.866265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.866069 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-web-config\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.866265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.866110 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-config\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.866265 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.866142 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.967407 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.967599 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967422 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.967599 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967447 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.967599 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967470 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.967599 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967486 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8gx6\" (UniqueName: \"kubernetes.io/projected/681625ef-d4f2-4499-83c8-2234ec8c6c18-kube-api-access-x8gx6\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.967599 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967521 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/681625ef-d4f2-4499-83c8-2234ec8c6c18-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.967599 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967537 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.967599 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.967599 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967569 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/681625ef-d4f2-4499-83c8-2234ec8c6c18-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.967599 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967595 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.968067 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967629 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.968067 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967659 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.968067 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.968067 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967711 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.968067 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967735 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/681625ef-d4f2-4499-83c8-2234ec8c6c18-config-out\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.968067 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967755 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-web-config\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.968067 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967789 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-config\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.968067 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.967830 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.968472 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.968432 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.968771 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.968744 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/681625ef-d4f2-4499-83c8-2234ec8c6c18-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.969381 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.969316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.970527 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.969476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.970527 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.970080 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.971852 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.971771 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.971852 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.971838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.972144 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.972096 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.972680 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.972579 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/681625ef-d4f2-4499-83c8-2234ec8c6c18-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.973016 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.972976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/681625ef-d4f2-4499-83c8-2234ec8c6c18-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.973016 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.972998 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-config\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.973365 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.973342 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-web-config\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.973556 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.973537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.973556 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.973549 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.973830 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.973808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/681625ef-d4f2-4499-83c8-2234ec8c6c18-config-out\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.973963 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.973946 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.974390 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.974369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/681625ef-d4f2-4499-83c8-2234ec8c6c18-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.976535 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:07.976514 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8gx6\" (UniqueName: \"kubernetes.io/projected/681625ef-d4f2-4499-83c8-2234ec8c6c18-kube-api-access-x8gx6\") pod \"prometheus-k8s-0\" (UID: \"681625ef-d4f2-4499-83c8-2234ec8c6c18\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:08.076334 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:08.076306 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:08.206483 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:08.206458 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:56:08.209444 ip-10-0-142-45 kubenswrapper[2579]: W0417 07:56:08.209418 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod681625ef_d4f2_4499_83c8_2234ec8c6c18.slice/crio-0658e224f5c127f67dbb428fb2e9a2b31636f94af0325fe57b537bdbd5348207 WatchSource:0}: Error finding container 0658e224f5c127f67dbb428fb2e9a2b31636f94af0325fe57b537bdbd5348207: Status 404 returned error can't find the container with id 0658e224f5c127f67dbb428fb2e9a2b31636f94af0325fe57b537bdbd5348207 Apr 17 07:56:08.702640 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:08.702611 2579 generic.go:358] "Generic (PLEG): container finished" podID="681625ef-d4f2-4499-83c8-2234ec8c6c18" containerID="272d0a927d624fd19ffec325ade20ce256e1e51edd36ee55e2c8c0abef781019" exitCode=0 Apr 17 07:56:08.703011 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:08.702652 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"681625ef-d4f2-4499-83c8-2234ec8c6c18","Type":"ContainerDied","Data":"272d0a927d624fd19ffec325ade20ce256e1e51edd36ee55e2c8c0abef781019"} Apr 17 07:56:08.703011 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:08.702671 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"681625ef-d4f2-4499-83c8-2234ec8c6c18","Type":"ContainerStarted","Data":"0658e224f5c127f67dbb428fb2e9a2b31636f94af0325fe57b537bdbd5348207"} Apr 17 07:56:08.884547 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:08.884514 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff" path="/var/lib/kubelet/pods/0064fe16-d7e9-4eb8-9fd6-d9a5cbcd91ff/volumes" Apr 17 07:56:09.707824 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:09.707785 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"681625ef-d4f2-4499-83c8-2234ec8c6c18","Type":"ContainerStarted","Data":"9b45695b5ff1ae00a5e568e5dc1dfaa706d0fcdfa6d8e4845cde30edc551542d"} Apr 17 07:56:09.707824 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:09.707828 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"681625ef-d4f2-4499-83c8-2234ec8c6c18","Type":"ContainerStarted","Data":"48972a94b0a790b94129000436c4654811739b28195439ee63831fbc53f0e11d"} Apr 17 07:56:09.708242 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:09.707843 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"681625ef-d4f2-4499-83c8-2234ec8c6c18","Type":"ContainerStarted","Data":"ae95c369188655a57ca2f8640ef4bb0b9f309b3fe57627444cf7e75802773af2"} Apr 17 07:56:09.708242 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:09.707855 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"681625ef-d4f2-4499-83c8-2234ec8c6c18","Type":"ContainerStarted","Data":"5e060de15ed6db609a9b22f584408b4a3ef4e3171a52cb700472defafe8908ce"} Apr 17 07:56:09.708242 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:09.707867 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"681625ef-d4f2-4499-83c8-2234ec8c6c18","Type":"ContainerStarted","Data":"22eb586c0e60306de4c8dc97edb636d17683916b27cdfac0feb2061240b493af"} Apr 17 07:56:09.708242 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:09.707907 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"681625ef-d4f2-4499-83c8-2234ec8c6c18","Type":"ContainerStarted","Data":"3ae47effc0243d1db6660567e1aa0239d2b84a7b4b722681f86a0ba16ccac283"} Apr 17 07:56:09.734354 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:09.734314 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.734295591 podStartE2EDuration="2.734295591s" podCreationTimestamp="2026-04-17 07:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:56:09.732825061 +0000 UTC m=+263.442428572" watchObservedRunningTime="2026-04-17 07:56:09.734295591 +0000 UTC m=+263.443899105" Apr 17 07:56:13.076721 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:13.076685 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:46.759587 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:46.759557 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 07:56:46.759587 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:46.759570 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 07:56:46.763099 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:46.763076 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 07:56:46.763297 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:46.763276 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 07:56:46.769948 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:56:46.769934 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 07:57:08.077068 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:57:08.077031 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:57:08.093209 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:57:08.093183 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:57:08.901453 ip-10-0-142-45 kubenswrapper[2579]: I0417 07:57:08.901423 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:00:15.695226 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.695191 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4zw5q"] Apr 17 08:00:15.698500 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.698478 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:15.700523 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.700500 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 08:00:15.706011 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.705989 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4zw5q"] Apr 17 08:00:15.713531 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.713504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8d5adbb5-8912-4dd9-9490-bc113967f040-dbus\") pod \"global-pull-secret-syncer-4zw5q\" (UID: \"8d5adbb5-8912-4dd9-9490-bc113967f040\") " pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:15.713711 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.713692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8d5adbb5-8912-4dd9-9490-bc113967f040-kubelet-config\") pod \"global-pull-secret-syncer-4zw5q\" (UID: \"8d5adbb5-8912-4dd9-9490-bc113967f040\") " pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:15.713859 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.713833 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d5adbb5-8912-4dd9-9490-bc113967f040-original-pull-secret\") pod \"global-pull-secret-syncer-4zw5q\" (UID: \"8d5adbb5-8912-4dd9-9490-bc113967f040\") " pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:15.814523 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.814482 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/8d5adbb5-8912-4dd9-9490-bc113967f040-original-pull-secret\") pod \"global-pull-secret-syncer-4zw5q\" (UID: \"8d5adbb5-8912-4dd9-9490-bc113967f040\") " pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:15.814705 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.814560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8d5adbb5-8912-4dd9-9490-bc113967f040-dbus\") pod \"global-pull-secret-syncer-4zw5q\" (UID: \"8d5adbb5-8912-4dd9-9490-bc113967f040\") " pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:15.814705 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.814584 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8d5adbb5-8912-4dd9-9490-bc113967f040-kubelet-config\") pod \"global-pull-secret-syncer-4zw5q\" (UID: \"8d5adbb5-8912-4dd9-9490-bc113967f040\") " pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:15.814705 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.814661 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8d5adbb5-8912-4dd9-9490-bc113967f040-kubelet-config\") pod \"global-pull-secret-syncer-4zw5q\" (UID: \"8d5adbb5-8912-4dd9-9490-bc113967f040\") " pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:15.814826 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.814743 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8d5adbb5-8912-4dd9-9490-bc113967f040-dbus\") pod \"global-pull-secret-syncer-4zw5q\" (UID: \"8d5adbb5-8912-4dd9-9490-bc113967f040\") " pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:15.817036 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:15.817015 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d5adbb5-8912-4dd9-9490-bc113967f040-original-pull-secret\") pod \"global-pull-secret-syncer-4zw5q\" (UID: \"8d5adbb5-8912-4dd9-9490-bc113967f040\") " pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:16.008443 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:16.008348 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zw5q" Apr 17 08:00:16.130830 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:16.130709 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4zw5q"] Apr 17 08:00:16.133464 ip-10-0-142-45 kubenswrapper[2579]: W0417 08:00:16.133435 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d5adbb5_8912_4dd9_9490_bc113967f040.slice/crio-38489015735d16a0f794f9c569f9954f4912a3ecf7bf85a212bdc1b77ec8eba9 WatchSource:0}: Error finding container 38489015735d16a0f794f9c569f9954f4912a3ecf7bf85a212bdc1b77ec8eba9: Status 404 returned error can't find the container with id 38489015735d16a0f794f9c569f9954f4912a3ecf7bf85a212bdc1b77ec8eba9 Apr 17 08:00:16.135057 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:16.135041 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:00:16.422226 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:16.422192 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4zw5q" event={"ID":"8d5adbb5-8912-4dd9-9490-bc113967f040","Type":"ContainerStarted","Data":"38489015735d16a0f794f9c569f9954f4912a3ecf7bf85a212bdc1b77ec8eba9"} Apr 17 08:00:20.436107 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:20.436066 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4zw5q" 
event={"ID":"8d5adbb5-8912-4dd9-9490-bc113967f040","Type":"ContainerStarted","Data":"1b70959297be23ca201b3d314ba5cb7de32546383837ef2cfd32fa0128df802a"} Apr 17 08:00:20.454027 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:00:20.453980 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4zw5q" podStartSLOduration=1.620480548 podStartE2EDuration="5.453964879s" podCreationTimestamp="2026-04-17 08:00:15 +0000 UTC" firstStartedPulling="2026-04-17 08:00:16.135161319 +0000 UTC m=+509.844764811" lastFinishedPulling="2026-04-17 08:00:19.968645652 +0000 UTC m=+513.678249142" observedRunningTime="2026-04-17 08:00:20.45255095 +0000 UTC m=+514.162154462" watchObservedRunningTime="2026-04-17 08:00:20.453964879 +0000 UTC m=+514.163568392" Apr 17 08:01:46.785157 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:01:46.785127 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 08:01:46.786054 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:01:46.786032 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 08:01:46.788205 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:01:46.788172 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 08:01:46.788870 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:01:46.788847 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 08:04:04.186860 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.186818 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-czjmb"] Apr 17 
08:04:04.190212 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.190191 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-czjmb" Apr 17 08:04:04.192325 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.192306 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 08:04:04.192421 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.192346 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 08:04:04.192877 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.192861 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 08:04:04.192974 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.192863 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-brlbt\"" Apr 17 08:04:04.196021 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.196002 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-czjmb"] Apr 17 08:04:04.294504 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.294446 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwbdc\" (UniqueName: \"kubernetes.io/projected/1e5521e4-05bf-4dda-94fb-09c1400e8ffd-kube-api-access-lwbdc\") pod \"s3-init-czjmb\" (UID: \"1e5521e4-05bf-4dda-94fb-09c1400e8ffd\") " pod="kserve/s3-init-czjmb" Apr 17 08:04:04.395825 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.395793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwbdc\" (UniqueName: \"kubernetes.io/projected/1e5521e4-05bf-4dda-94fb-09c1400e8ffd-kube-api-access-lwbdc\") pod \"s3-init-czjmb\" (UID: \"1e5521e4-05bf-4dda-94fb-09c1400e8ffd\") " pod="kserve/s3-init-czjmb" Apr 17 08:04:04.403450 ip-10-0-142-45 kubenswrapper[2579]: I0417 
08:04:04.403426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwbdc\" (UniqueName: \"kubernetes.io/projected/1e5521e4-05bf-4dda-94fb-09c1400e8ffd-kube-api-access-lwbdc\") pod \"s3-init-czjmb\" (UID: \"1e5521e4-05bf-4dda-94fb-09c1400e8ffd\") " pod="kserve/s3-init-czjmb" Apr 17 08:04:04.511167 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.511091 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-czjmb" Apr 17 08:04:04.640255 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:04.640232 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-czjmb"] Apr 17 08:04:04.642859 ip-10-0-142-45 kubenswrapper[2579]: W0417 08:04:04.642830 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e5521e4_05bf_4dda_94fb_09c1400e8ffd.slice/crio-acf432a10c56fe3793c71797634f2c8055b9160ee1b6d8ece77f77d8f6c8dca8 WatchSource:0}: Error finding container acf432a10c56fe3793c71797634f2c8055b9160ee1b6d8ece77f77d8f6c8dca8: Status 404 returned error can't find the container with id acf432a10c56fe3793c71797634f2c8055b9160ee1b6d8ece77f77d8f6c8dca8 Apr 17 08:04:05.089625 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:05.089592 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-czjmb" event={"ID":"1e5521e4-05bf-4dda-94fb-09c1400e8ffd","Type":"ContainerStarted","Data":"acf432a10c56fe3793c71797634f2c8055b9160ee1b6d8ece77f77d8f6c8dca8"} Apr 17 08:04:10.106953 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:10.106917 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-czjmb" event={"ID":"1e5521e4-05bf-4dda-94fb-09c1400e8ffd","Type":"ContainerStarted","Data":"d1fe0e27034f6fd5daab435c870da5f73dce9ec46d01c17d12127659c75a9a23"} Apr 17 08:04:10.122039 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:10.121989 2579 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve/s3-init-czjmb" podStartSLOduration=1.3233494160000001 podStartE2EDuration="6.121972945s" podCreationTimestamp="2026-04-17 08:04:04 +0000 UTC" firstStartedPulling="2026-04-17 08:04:04.644682794 +0000 UTC m=+738.354286299" lastFinishedPulling="2026-04-17 08:04:09.443306338 +0000 UTC m=+743.152909828" observedRunningTime="2026-04-17 08:04:10.120308182 +0000 UTC m=+743.829911696" watchObservedRunningTime="2026-04-17 08:04:10.121972945 +0000 UTC m=+743.831576457" Apr 17 08:04:13.116775 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:13.116744 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e5521e4-05bf-4dda-94fb-09c1400e8ffd" containerID="d1fe0e27034f6fd5daab435c870da5f73dce9ec46d01c17d12127659c75a9a23" exitCode=0 Apr 17 08:04:13.117166 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:13.116815 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-czjmb" event={"ID":"1e5521e4-05bf-4dda-94fb-09c1400e8ffd","Type":"ContainerDied","Data":"d1fe0e27034f6fd5daab435c870da5f73dce9ec46d01c17d12127659c75a9a23"} Apr 17 08:04:14.246918 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:14.246872 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-czjmb" Apr 17 08:04:14.283305 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:14.283277 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwbdc\" (UniqueName: \"kubernetes.io/projected/1e5521e4-05bf-4dda-94fb-09c1400e8ffd-kube-api-access-lwbdc\") pod \"1e5521e4-05bf-4dda-94fb-09c1400e8ffd\" (UID: \"1e5521e4-05bf-4dda-94fb-09c1400e8ffd\") " Apr 17 08:04:14.285469 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:14.285444 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e5521e4-05bf-4dda-94fb-09c1400e8ffd-kube-api-access-lwbdc" (OuterVolumeSpecName: "kube-api-access-lwbdc") pod "1e5521e4-05bf-4dda-94fb-09c1400e8ffd" (UID: "1e5521e4-05bf-4dda-94fb-09c1400e8ffd"). InnerVolumeSpecName "kube-api-access-lwbdc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:04:14.384662 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:14.384631 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lwbdc\" (UniqueName: \"kubernetes.io/projected/1e5521e4-05bf-4dda-94fb-09c1400e8ffd-kube-api-access-lwbdc\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 08:04:15.128169 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:15.128139 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-czjmb" Apr 17 08:04:15.128169 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:15.128149 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-czjmb" event={"ID":"1e5521e4-05bf-4dda-94fb-09c1400e8ffd","Type":"ContainerDied","Data":"acf432a10c56fe3793c71797634f2c8055b9160ee1b6d8ece77f77d8f6c8dca8"} Apr 17 08:04:15.128169 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:04:15.128177 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acf432a10c56fe3793c71797634f2c8055b9160ee1b6d8ece77f77d8f6c8dca8" Apr 17 08:06:46.808730 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:06:46.808703 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 08:06:46.809729 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:06:46.809709 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 08:06:46.811841 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:06:46.811821 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 08:06:46.812589 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:06:46.812566 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 08:07:29.199205 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.199164 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc"] Apr 17 08:07:29.199918 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.199877 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1e5521e4-05bf-4dda-94fb-09c1400e8ffd" containerName="s3-init" Apr 17 08:07:29.199971 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.199921 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5521e4-05bf-4dda-94fb-09c1400e8ffd" containerName="s3-init" Apr 17 08:07:29.200115 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.200104 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e5521e4-05bf-4dda-94fb-09c1400e8ffd" containerName="s3-init" Apr 17 08:07:29.202611 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.202585 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:29.205377 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.205347 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-247b1-kube-rbac-proxy-sar-config\"" Apr 17 08:07:29.205377 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.205388 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-k26cm\"" Apr 17 08:07:29.205614 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.205478 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-247b1-serving-cert\"" Apr 17 08:07:29.206089 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.206067 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 08:07:29.208722 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.208702 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc"] Apr 17 08:07:29.281594 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.281559 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-proxy-tls\") pod \"model-chainer-raw-247b1-67485f48df-xgskc\" (UID: \"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e\") " pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:29.281807 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.281650 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-openshift-service-ca-bundle\") pod \"model-chainer-raw-247b1-67485f48df-xgskc\" (UID: \"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e\") " pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:29.382348 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.382315 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-proxy-tls\") pod \"model-chainer-raw-247b1-67485f48df-xgskc\" (UID: \"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e\") " pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:29.382518 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.382430 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-openshift-service-ca-bundle\") pod \"model-chainer-raw-247b1-67485f48df-xgskc\" (UID: \"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e\") " pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:29.383099 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.383082 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-openshift-service-ca-bundle\") pod \"model-chainer-raw-247b1-67485f48df-xgskc\" (UID: 
\"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e\") " pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:29.384895 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.384868 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-proxy-tls\") pod \"model-chainer-raw-247b1-67485f48df-xgskc\" (UID: \"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e\") " pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:29.516038 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.515952 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:29.637204 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.637169 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc"] Apr 17 08:07:29.640804 ip-10-0-142-45 kubenswrapper[2579]: W0417 08:07:29.640773 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae42a07_53f9_4ec5_a950_9a9dfc6ee74e.slice/crio-2f366f74526a8e18c2c9f4de87fc8f2ba2ba1bd2fcb4407907bf9df3d1a72f5d WatchSource:0}: Error finding container 2f366f74526a8e18c2c9f4de87fc8f2ba2ba1bd2fcb4407907bf9df3d1a72f5d: Status 404 returned error can't find the container with id 2f366f74526a8e18c2c9f4de87fc8f2ba2ba1bd2fcb4407907bf9df3d1a72f5d Apr 17 08:07:29.642513 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.642489 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:07:29.707844 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:29.707808 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" 
event={"ID":"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e","Type":"ContainerStarted","Data":"2f366f74526a8e18c2c9f4de87fc8f2ba2ba1bd2fcb4407907bf9df3d1a72f5d"} Apr 17 08:07:32.718231 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:32.718190 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" event={"ID":"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e","Type":"ContainerStarted","Data":"935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d"} Apr 17 08:07:32.718617 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:32.718308 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:32.734634 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:32.734585 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" podStartSLOduration=1.367310866 podStartE2EDuration="3.734558764s" podCreationTimestamp="2026-04-17 08:07:29 +0000 UTC" firstStartedPulling="2026-04-17 08:07:29.642615115 +0000 UTC m=+943.352218609" lastFinishedPulling="2026-04-17 08:07:32.009863016 +0000 UTC m=+945.719466507" observedRunningTime="2026-04-17 08:07:32.73322629 +0000 UTC m=+946.442829802" watchObservedRunningTime="2026-04-17 08:07:32.734558764 +0000 UTC m=+946.444162276" Apr 17 08:07:38.726984 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:38.726956 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:39.247498 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:39.247462 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc"] Apr 17 08:07:39.247731 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:39.247708 2579 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerName="model-chainer-raw-247b1" containerID="cri-o://935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d" gracePeriod=30 Apr 17 08:07:43.725771 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:43.725732 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerName="model-chainer-raw-247b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:07:48.725788 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:48.725750 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerName="model-chainer-raw-247b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:07:53.725630 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:53.725585 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerName="model-chainer-raw-247b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:07:53.726193 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:53.725699 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:07:58.725832 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:07:58.725794 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerName="model-chainer-raw-247b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:08:03.725449 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:03.725350 
2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerName="model-chainer-raw-247b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:08:08.725656 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:08.725612 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerName="model-chainer-raw-247b1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:08:09.273411 ip-10-0-142-45 kubenswrapper[2579]: E0417 08:08:09.273374 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae42a07_53f9_4ec5_a950_9a9dfc6ee74e.slice/crio-conmon-935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d.scope\": RecentStats: unable to find data in memory cache]" Apr 17 08:08:09.273516 ip-10-0-142-45 kubenswrapper[2579]: E0417 08:08:09.273470 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae42a07_53f9_4ec5_a950_9a9dfc6ee74e.slice/crio-conmon-935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d.scope\": RecentStats: unable to find data in memory cache]" Apr 17 08:08:09.273576 ip-10-0-142-45 kubenswrapper[2579]: E0417 08:08:09.273475 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae42a07_53f9_4ec5_a950_9a9dfc6ee74e.slice/crio-conmon-935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d.scope\": RecentStats: unable to find data in memory cache]" Apr 17 08:08:09.391400 ip-10-0-142-45 
kubenswrapper[2579]: I0417 08:08:09.391377 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:08:09.420933 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.420907 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-proxy-tls\") pod \"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e\" (UID: \"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e\") " Apr 17 08:08:09.421065 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.420975 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-openshift-service-ca-bundle\") pod \"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e\" (UID: \"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e\") " Apr 17 08:08:09.421311 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.421284 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" (UID: "0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:08:09.423236 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.423211 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" (UID: "0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:08:09.522170 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.522093 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-proxy-tls\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 08:08:09.522170 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.522120 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e-openshift-service-ca-bundle\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 08:08:09.828958 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.828856 2579 generic.go:358] "Generic (PLEG): container finished" podID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerID="935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d" exitCode=0 Apr 17 08:08:09.828958 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.828942 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" Apr 17 08:08:09.828958 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.828941 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" event={"ID":"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e","Type":"ContainerDied","Data":"935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d"} Apr 17 08:08:09.829511 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.828978 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc" event={"ID":"0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e","Type":"ContainerDied","Data":"2f366f74526a8e18c2c9f4de87fc8f2ba2ba1bd2fcb4407907bf9df3d1a72f5d"} Apr 17 08:08:09.829511 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.828993 2579 scope.go:117] "RemoveContainer" containerID="935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d" Apr 17 08:08:09.839956 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.839847 2579 scope.go:117] "RemoveContainer" containerID="935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d" Apr 17 08:08:09.840302 ip-10-0-142-45 kubenswrapper[2579]: E0417 08:08:09.840277 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d\": container with ID starting with 935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d not found: ID does not exist" containerID="935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d" Apr 17 08:08:09.840383 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.840317 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d"} err="failed to get container status 
\"935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d\": rpc error: code = NotFound desc = could not find container \"935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d\": container with ID starting with 935867a0394a0edea1453099dbf5ed697c6bf153dcea089883d21550992b135d not found: ID does not exist" Apr 17 08:08:09.851407 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.851387 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc"] Apr 17 08:08:09.852768 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:09.852744 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-247b1-67485f48df-xgskc"] Apr 17 08:08:10.881968 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:08:10.881932 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" path="/var/lib/kubelet/pods/0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e/volumes" Apr 17 08:09:09.474924 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.474872 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67"] Apr 17 08:09:09.475389 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.475374 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerName="model-chainer-raw-247b1" Apr 17 08:09:09.475443 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.475392 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerName="model-chainer-raw-247b1" Apr 17 08:09:09.475479 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.475470 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ae42a07-53f9-4ec5-a950-9a9dfc6ee74e" containerName="model-chainer-raw-247b1" Apr 17 08:09:09.478642 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.478624 2579 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:09.480553 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.480525 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-4b469-kube-rbac-proxy-sar-config\"" Apr 17 08:09:09.480672 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.480569 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-4b469-serving-cert\"" Apr 17 08:09:09.480672 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.480590 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-k26cm\"" Apr 17 08:09:09.481050 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.481037 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 08:09:09.483841 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.483818 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67"] Apr 17 08:09:09.516054 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.516022 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4b469-6697f97df-nxj67\" (UID: \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:09.516054 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.516056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-proxy-tls\") pod \"model-chainer-raw-hpa-4b469-6697f97df-nxj67\" (UID: \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:09.617360 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.617312 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4b469-6697f97df-nxj67\" (UID: \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:09.617360 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.617359 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-proxy-tls\") pod \"model-chainer-raw-hpa-4b469-6697f97df-nxj67\" (UID: \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:09.617620 ip-10-0-142-45 kubenswrapper[2579]: E0417 08:09:09.617532 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-serving-cert: secret "model-chainer-raw-hpa-4b469-serving-cert" not found Apr 17 08:09:09.617620 ip-10-0-142-45 kubenswrapper[2579]: E0417 08:09:09.617602 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-proxy-tls podName:bd6b2271-6fd3-4e6a-98b1-893daa6faa93 nodeName:}" failed. No retries permitted until 2026-04-17 08:09:10.117586328 +0000 UTC m=+1043.827189819 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-proxy-tls") pod "model-chainer-raw-hpa-4b469-6697f97df-nxj67" (UID: "bd6b2271-6fd3-4e6a-98b1-893daa6faa93") : secret "model-chainer-raw-hpa-4b469-serving-cert" not found Apr 17 08:09:09.618072 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:09.618053 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4b469-6697f97df-nxj67\" (UID: \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:10.122003 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:10.121954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-proxy-tls\") pod \"model-chainer-raw-hpa-4b469-6697f97df-nxj67\" (UID: \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:10.124604 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:10.124580 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-proxy-tls\") pod \"model-chainer-raw-hpa-4b469-6697f97df-nxj67\" (UID: \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:10.390375 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:10.390339 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:10.508952 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:10.508746 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67"] Apr 17 08:09:10.511541 ip-10-0-142-45 kubenswrapper[2579]: W0417 08:09:10.511514 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd6b2271_6fd3_4e6a_98b1_893daa6faa93.slice/crio-cb69b34a33650cc7b7a262d3ce0172e1b691119df06e45582178941355c7134e WatchSource:0}: Error finding container cb69b34a33650cc7b7a262d3ce0172e1b691119df06e45582178941355c7134e: Status 404 returned error can't find the container with id cb69b34a33650cc7b7a262d3ce0172e1b691119df06e45582178941355c7134e Apr 17 08:09:11.011381 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:11.011347 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" event={"ID":"bd6b2271-6fd3-4e6a-98b1-893daa6faa93","Type":"ContainerStarted","Data":"01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf"} Apr 17 08:09:11.011381 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:11.011383 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" event={"ID":"bd6b2271-6fd3-4e6a-98b1-893daa6faa93","Type":"ContainerStarted","Data":"cb69b34a33650cc7b7a262d3ce0172e1b691119df06e45582178941355c7134e"} Apr 17 08:09:11.011584 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:11.011409 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:11.025475 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:11.025432 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" podStartSLOduration=2.025417607 podStartE2EDuration="2.025417607s" podCreationTimestamp="2026-04-17 08:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:09:11.024354433 +0000 UTC m=+1044.733957947" watchObservedRunningTime="2026-04-17 08:09:11.025417607 +0000 UTC m=+1044.735021117" Apr 17 08:09:17.020075 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:17.020043 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:19.516800 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:19.516769 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67"] Apr 17 08:09:19.517386 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:19.516982 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" containerName="model-chainer-raw-hpa-4b469" containerID="cri-o://01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf" gracePeriod=30 Apr 17 08:09:22.017875 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:22.017837 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" containerName="model-chainer-raw-hpa-4b469" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:09:27.018125 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:27.018082 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" containerName="model-chainer-raw-hpa-4b469" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 17 08:09:32.017987 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:32.017901 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" containerName="model-chainer-raw-hpa-4b469" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:09:32.018323 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:32.018020 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:37.018408 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:37.018368 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" containerName="model-chainer-raw-hpa-4b469" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:09:42.018515 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:42.018469 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" containerName="model-chainer-raw-hpa-4b469" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:09:47.018441 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:47.018398 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" containerName="model-chainer-raw-hpa-4b469" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:09:49.528556 ip-10-0-142-45 kubenswrapper[2579]: E0417 08:09:49.528469 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd6b2271_6fd3_4e6a_98b1_893daa6faa93.slice/crio-cb69b34a33650cc7b7a262d3ce0172e1b691119df06e45582178941355c7134e\": RecentStats: unable to find data in memory cache]" Apr 17 08:09:49.658543 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:49.658521 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:49.740036 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:49.740004 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-openshift-service-ca-bundle\") pod \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\" (UID: \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\") " Apr 17 08:09:49.740186 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:49.740050 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-proxy-tls\") pod \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\" (UID: \"bd6b2271-6fd3-4e6a-98b1-893daa6faa93\") " Apr 17 08:09:49.740372 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:49.740349 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "bd6b2271-6fd3-4e6a-98b1-893daa6faa93" (UID: "bd6b2271-6fd3-4e6a-98b1-893daa6faa93"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:09:49.742290 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:49.742269 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bd6b2271-6fd3-4e6a-98b1-893daa6faa93" (UID: "bd6b2271-6fd3-4e6a-98b1-893daa6faa93"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:09:49.840875 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:49.840799 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-openshift-service-ca-bundle\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 08:09:49.840875 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:49.840830 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd6b2271-6fd3-4e6a-98b1-893daa6faa93-proxy-tls\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 08:09:50.120740 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:50.120650 2579 generic.go:358] "Generic (PLEG): container finished" podID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" containerID="01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf" exitCode=0 Apr 17 08:09:50.120740 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:50.120716 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" Apr 17 08:09:50.120985 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:50.120747 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" event={"ID":"bd6b2271-6fd3-4e6a-98b1-893daa6faa93","Type":"ContainerDied","Data":"01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf"} Apr 17 08:09:50.120985 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:50.120793 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67" event={"ID":"bd6b2271-6fd3-4e6a-98b1-893daa6faa93","Type":"ContainerDied","Data":"cb69b34a33650cc7b7a262d3ce0172e1b691119df06e45582178941355c7134e"} Apr 17 08:09:50.120985 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:50.120812 2579 scope.go:117] "RemoveContainer" containerID="01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf" Apr 17 08:09:50.128780 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:50.128756 2579 scope.go:117] "RemoveContainer" containerID="01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf" Apr 17 08:09:50.129073 ip-10-0-142-45 kubenswrapper[2579]: E0417 08:09:50.129048 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf\": container with ID starting with 01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf not found: ID does not exist" containerID="01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf" Apr 17 08:09:50.129129 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:50.129086 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf"} err="failed to get container status 
\"01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf\": rpc error: code = NotFound desc = could not find container \"01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf\": container with ID starting with 01b7c041bbf1f2c52e6f5a69c453531bfa1b795b8ed757c97fe31e1756d993bf not found: ID does not exist" Apr 17 08:09:50.142839 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:50.142815 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67"] Apr 17 08:09:50.146350 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:50.146330 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4b469-6697f97df-nxj67"] Apr 17 08:09:50.882024 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:09:50.881985 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" path="/var/lib/kubelet/pods/bd6b2271-6fd3-4e6a-98b1-893daa6faa93/volumes" Apr 17 08:11:46.832989 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:11:46.832956 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 08:11:46.834629 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:11:46.834606 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 08:11:46.835640 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:11:46.835620 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 08:11:46.837352 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:11:46.837333 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 08:16:46.855409 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:16:46.855379 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 08:16:46.857445 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:16:46.857423 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 08:16:46.858286 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:16:46.858268 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 08:16:46.860217 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:16:46.860186 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 08:17:52.911355 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:52.911319 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dfbrg/must-gather-sjmp8"] Apr 17 08:17:52.911804 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:52.911649 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" containerName="model-chainer-raw-hpa-4b469" Apr 17 08:17:52.911804 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:52.911660 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" containerName="model-chainer-raw-hpa-4b469" Apr 17 08:17:52.911804 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:52.911711 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd6b2271-6fd3-4e6a-98b1-893daa6faa93" 
containerName="model-chainer-raw-hpa-4b469" Apr 17 08:17:52.914821 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:52.914801 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" Apr 17 08:17:52.916972 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:52.916950 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dfbrg\"/\"kube-root-ca.crt\"" Apr 17 08:17:52.916972 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:52.916974 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dfbrg\"/\"openshift-service-ca.crt\"" Apr 17 08:17:52.917150 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:52.917112 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dfbrg\"/\"default-dockercfg-5gcl5\"" Apr 17 08:17:52.922435 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:52.922416 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dfbrg/must-gather-sjmp8"] Apr 17 08:17:53.061610 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:53.061583 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1287ce36-a083-45ae-ba05-a238566f322c-must-gather-output\") pod \"must-gather-sjmp8\" (UID: \"1287ce36-a083-45ae-ba05-a238566f322c\") " pod="openshift-must-gather-dfbrg/must-gather-sjmp8" Apr 17 08:17:53.061769 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:53.061642 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj7fj\" (UniqueName: \"kubernetes.io/projected/1287ce36-a083-45ae-ba05-a238566f322c-kube-api-access-rj7fj\") pod \"must-gather-sjmp8\" (UID: \"1287ce36-a083-45ae-ba05-a238566f322c\") " pod="openshift-must-gather-dfbrg/must-gather-sjmp8" Apr 17 08:17:53.162604 
ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:53.162523 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1287ce36-a083-45ae-ba05-a238566f322c-must-gather-output\") pod \"must-gather-sjmp8\" (UID: \"1287ce36-a083-45ae-ba05-a238566f322c\") " pod="openshift-must-gather-dfbrg/must-gather-sjmp8" Apr 17 08:17:53.162747 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:53.162611 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rj7fj\" (UniqueName: \"kubernetes.io/projected/1287ce36-a083-45ae-ba05-a238566f322c-kube-api-access-rj7fj\") pod \"must-gather-sjmp8\" (UID: \"1287ce36-a083-45ae-ba05-a238566f322c\") " pod="openshift-must-gather-dfbrg/must-gather-sjmp8" Apr 17 08:17:53.162868 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:53.162846 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1287ce36-a083-45ae-ba05-a238566f322c-must-gather-output\") pod \"must-gather-sjmp8\" (UID: \"1287ce36-a083-45ae-ba05-a238566f322c\") " pod="openshift-must-gather-dfbrg/must-gather-sjmp8" Apr 17 08:17:53.170378 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:53.170353 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj7fj\" (UniqueName: \"kubernetes.io/projected/1287ce36-a083-45ae-ba05-a238566f322c-kube-api-access-rj7fj\") pod \"must-gather-sjmp8\" (UID: \"1287ce36-a083-45ae-ba05-a238566f322c\") " pod="openshift-must-gather-dfbrg/must-gather-sjmp8" Apr 17 08:17:53.240550 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:53.240522 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" Apr 17 08:17:53.359186 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:53.359164 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dfbrg/must-gather-sjmp8"] Apr 17 08:17:53.361940 ip-10-0-142-45 kubenswrapper[2579]: W0417 08:17:53.361870 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1287ce36_a083_45ae_ba05_a238566f322c.slice/crio-c193adbdc666364f8ff3349c8659dd7debe6c6a4df56af9103ac9d6ee97c8ce8 WatchSource:0}: Error finding container c193adbdc666364f8ff3349c8659dd7debe6c6a4df56af9103ac9d6ee97c8ce8: Status 404 returned error can't find the container with id c193adbdc666364f8ff3349c8659dd7debe6c6a4df56af9103ac9d6ee97c8ce8 Apr 17 08:17:53.363656 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:53.363637 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:17:53.529087 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:53.528999 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" event={"ID":"1287ce36-a083-45ae-ba05-a238566f322c","Type":"ContainerStarted","Data":"c193adbdc666364f8ff3349c8659dd7debe6c6a4df56af9103ac9d6ee97c8ce8"} Apr 17 08:17:59.551034 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:59.550994 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" event={"ID":"1287ce36-a083-45ae-ba05-a238566f322c","Type":"ContainerStarted","Data":"711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96"} Apr 17 08:17:59.551034 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:59.551035 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" 
event={"ID":"1287ce36-a083-45ae-ba05-a238566f322c","Type":"ContainerStarted","Data":"aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66"} Apr 17 08:17:59.567350 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:17:59.567300 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" podStartSLOduration=2.446210312 podStartE2EDuration="7.567284667s" podCreationTimestamp="2026-04-17 08:17:52 +0000 UTC" firstStartedPulling="2026-04-17 08:17:53.36381881 +0000 UTC m=+1567.073422315" lastFinishedPulling="2026-04-17 08:17:58.484893176 +0000 UTC m=+1572.194496670" observedRunningTime="2026-04-17 08:17:59.56564398 +0000 UTC m=+1573.275247493" watchObservedRunningTime="2026-04-17 08:17:59.567284667 +0000 UTC m=+1573.276888187" Apr 17 08:18:16.607641 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:16.607600 2579 generic.go:358] "Generic (PLEG): container finished" podID="1287ce36-a083-45ae-ba05-a238566f322c" containerID="aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66" exitCode=0 Apr 17 08:18:16.608054 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:16.607657 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" event={"ID":"1287ce36-a083-45ae-ba05-a238566f322c","Type":"ContainerDied","Data":"aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66"} Apr 17 08:18:16.608054 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:16.607955 2579 scope.go:117] "RemoveContainer" containerID="aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66" Apr 17 08:18:16.885129 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:16.885096 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dfbrg_must-gather-sjmp8_1287ce36-a083-45ae-ba05-a238566f322c/gather/0.log" Apr 17 08:18:17.410809 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.410770 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-nl5sm/must-gather-8nj4v"] Apr 17 08:18:17.414365 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.414347 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nl5sm/must-gather-8nj4v" Apr 17 08:18:17.416903 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.416859 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nl5sm\"/\"openshift-service-ca.crt\"" Apr 17 08:18:17.417020 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.416953 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nl5sm\"/\"default-dockercfg-r66q7\"" Apr 17 08:18:17.417564 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.417547 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nl5sm\"/\"kube-root-ca.crt\"" Apr 17 08:18:17.423848 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.423825 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nl5sm/must-gather-8nj4v"] Apr 17 08:18:17.483654 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.483618 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vf6d\" (UniqueName: \"kubernetes.io/projected/7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba-kube-api-access-2vf6d\") pod \"must-gather-8nj4v\" (UID: \"7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba\") " pod="openshift-must-gather-nl5sm/must-gather-8nj4v" Apr 17 08:18:17.483832 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.483663 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba-must-gather-output\") pod \"must-gather-8nj4v\" (UID: \"7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba\") " pod="openshift-must-gather-nl5sm/must-gather-8nj4v" Apr 17 
08:18:17.584953 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.584906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vf6d\" (UniqueName: \"kubernetes.io/projected/7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba-kube-api-access-2vf6d\") pod \"must-gather-8nj4v\" (UID: \"7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba\") " pod="openshift-must-gather-nl5sm/must-gather-8nj4v" Apr 17 08:18:17.585144 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.584979 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba-must-gather-output\") pod \"must-gather-8nj4v\" (UID: \"7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba\") " pod="openshift-must-gather-nl5sm/must-gather-8nj4v" Apr 17 08:18:17.585341 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.585312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba-must-gather-output\") pod \"must-gather-8nj4v\" (UID: \"7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba\") " pod="openshift-must-gather-nl5sm/must-gather-8nj4v" Apr 17 08:18:17.593406 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.593384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vf6d\" (UniqueName: \"kubernetes.io/projected/7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba-kube-api-access-2vf6d\") pod \"must-gather-8nj4v\" (UID: \"7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba\") " pod="openshift-must-gather-nl5sm/must-gather-8nj4v" Apr 17 08:18:17.724538 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.724438 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nl5sm/must-gather-8nj4v" Apr 17 08:18:17.842170 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:17.842133 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nl5sm/must-gather-8nj4v"] Apr 17 08:18:17.844979 ip-10-0-142-45 kubenswrapper[2579]: W0417 08:18:17.844946 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7de4fcc1_86f9_44cf_bb9f_c40cbb3108ba.slice/crio-363936a6222db478b15a469d308d8306299c644ebf1f62a1b99453d93b806022 WatchSource:0}: Error finding container 363936a6222db478b15a469d308d8306299c644ebf1f62a1b99453d93b806022: Status 404 returned error can't find the container with id 363936a6222db478b15a469d308d8306299c644ebf1f62a1b99453d93b806022 Apr 17 08:18:18.616557 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:18.616525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nl5sm/must-gather-8nj4v" event={"ID":"7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba","Type":"ContainerStarted","Data":"363936a6222db478b15a469d308d8306299c644ebf1f62a1b99453d93b806022"} Apr 17 08:18:19.622234 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:19.622194 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nl5sm/must-gather-8nj4v" event={"ID":"7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba","Type":"ContainerStarted","Data":"d692c5b1c947828eda93eaf01cd734c1299b0d308babf83ac0fdee32094ab274"} Apr 17 08:18:19.622234 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:19.622238 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nl5sm/must-gather-8nj4v" event={"ID":"7de4fcc1-86f9-44cf-bb9f-c40cbb3108ba","Type":"ContainerStarted","Data":"4cb345d9d999cc864f02ed514b9f7f50f5abc3bd9133dfd2a0c5a21661f37f1f"} Apr 17 08:18:19.636281 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:19.636220 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-nl5sm/must-gather-8nj4v" podStartSLOduration=1.8645855660000001 podStartE2EDuration="2.636200481s" podCreationTimestamp="2026-04-17 08:18:17 +0000 UTC" firstStartedPulling="2026-04-17 08:18:17.846817645 +0000 UTC m=+1591.556421143" lastFinishedPulling="2026-04-17 08:18:18.618432566 +0000 UTC m=+1592.328036058" observedRunningTime="2026-04-17 08:18:19.635806863 +0000 UTC m=+1593.345410378" watchObservedRunningTime="2026-04-17 08:18:19.636200481 +0000 UTC m=+1593.345803995" Apr 17 08:18:20.020002 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:20.019967 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4zw5q_8d5adbb5-8912-4dd9-9490-bc113967f040/global-pull-secret-syncer/0.log" Apr 17 08:18:20.170435 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:20.170401 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-862qc_954c8165-6465-403b-9b5d-f03c3bcce354/konnectivity-agent/0.log" Apr 17 08:18:20.316860 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:20.316784 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-45.ec2.internal_16712333c9c8322032c37f859cfbedd3/haproxy/0.log" Apr 17 08:18:22.247385 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.247332 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dfbrg/must-gather-sjmp8"] Apr 17 08:18:22.247940 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.247658 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" podUID="1287ce36-a083-45ae-ba05-a238566f322c" containerName="copy" containerID="cri-o://711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96" gracePeriod=2 Apr 17 08:18:22.249455 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.249413 2579 status_manager.go:895] "Failed to get status for pod" 
podUID="1287ce36-a083-45ae-ba05-a238566f322c" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" err="pods \"must-gather-sjmp8\" is forbidden: User \"system:node:ip-10-0-142-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-dfbrg\": no relationship found between node 'ip-10-0-142-45.ec2.internal' and this object" Apr 17 08:18:22.252472 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.252448 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dfbrg/must-gather-sjmp8"] Apr 17 08:18:22.587227 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.587146 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dfbrg_must-gather-sjmp8_1287ce36-a083-45ae-ba05-a238566f322c/copy/0.log" Apr 17 08:18:22.588208 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.587955 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" Apr 17 08:18:22.589720 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.589685 2579 status_manager.go:895] "Failed to get status for pod" podUID="1287ce36-a083-45ae-ba05-a238566f322c" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" err="pods \"must-gather-sjmp8\" is forbidden: User \"system:node:ip-10-0-142-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-dfbrg\": no relationship found between node 'ip-10-0-142-45.ec2.internal' and this object" Apr 17 08:18:22.632593 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.631930 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj7fj\" (UniqueName: \"kubernetes.io/projected/1287ce36-a083-45ae-ba05-a238566f322c-kube-api-access-rj7fj\") pod \"1287ce36-a083-45ae-ba05-a238566f322c\" (UID: \"1287ce36-a083-45ae-ba05-a238566f322c\") " Apr 17 08:18:22.632593 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.631989 2579 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1287ce36-a083-45ae-ba05-a238566f322c-must-gather-output\") pod \"1287ce36-a083-45ae-ba05-a238566f322c\" (UID: \"1287ce36-a083-45ae-ba05-a238566f322c\") " Apr 17 08:18:22.634060 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.633556 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1287ce36-a083-45ae-ba05-a238566f322c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1287ce36-a083-45ae-ba05-a238566f322c" (UID: "1287ce36-a083-45ae-ba05-a238566f322c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:18:22.636027 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.635944 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dfbrg_must-gather-sjmp8_1287ce36-a083-45ae-ba05-a238566f322c/copy/0.log" Apr 17 08:18:22.637115 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.636461 2579 generic.go:358] "Generic (PLEG): container finished" podID="1287ce36-a083-45ae-ba05-a238566f322c" containerID="711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96" exitCode=143 Apr 17 08:18:22.637115 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.636582 2579 scope.go:117] "RemoveContainer" containerID="711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96" Apr 17 08:18:22.637115 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.636725 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" Apr 17 08:18:22.653030 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.652117 2579 status_manager.go:895] "Failed to get status for pod" podUID="1287ce36-a083-45ae-ba05-a238566f322c" pod="openshift-must-gather-dfbrg/must-gather-sjmp8" err="pods \"must-gather-sjmp8\" is forbidden: User \"system:node:ip-10-0-142-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-dfbrg\": no relationship found between node 'ip-10-0-142-45.ec2.internal' and this object" Apr 17 08:18:22.653375 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.653300 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1287ce36-a083-45ae-ba05-a238566f322c-kube-api-access-rj7fj" (OuterVolumeSpecName: "kube-api-access-rj7fj") pod "1287ce36-a083-45ae-ba05-a238566f322c" (UID: "1287ce36-a083-45ae-ba05-a238566f322c"). InnerVolumeSpecName "kube-api-access-rj7fj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:18:22.668128 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.668083 2579 scope.go:117] "RemoveContainer" containerID="aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66" Apr 17 08:18:22.694587 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.694544 2579 scope.go:117] "RemoveContainer" containerID="711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96" Apr 17 08:18:22.698188 ip-10-0-142-45 kubenswrapper[2579]: E0417 08:18:22.697995 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96\": container with ID starting with 711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96 not found: ID does not exist" containerID="711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96" Apr 17 08:18:22.698188 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.698047 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96"} err="failed to get container status \"711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96\": rpc error: code = NotFound desc = could not find container \"711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96\": container with ID starting with 711bb12d2cdbc54fd79ed4ae53016b651bd0425df948cd81230edff048a76a96 not found: ID does not exist" Apr 17 08:18:22.698188 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.698077 2579 scope.go:117] "RemoveContainer" containerID="aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66" Apr 17 08:18:22.698435 ip-10-0-142-45 kubenswrapper[2579]: E0417 08:18:22.698410 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66\": container with ID starting with aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66 not found: ID does not exist" containerID="aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66" Apr 17 08:18:22.698497 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.698447 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66"} err="failed to get container status \"aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66\": rpc error: code = NotFound desc = could not find container \"aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66\": container with ID starting with aa5cd0a766a057b7283e63aa4da116cae6190228f1358d264a56c89a3dff9a66 not found: ID does not exist" Apr 17 08:18:22.733681 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.733617 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rj7fj\" (UniqueName: \"kubernetes.io/projected/1287ce36-a083-45ae-ba05-a238566f322c-kube-api-access-rj7fj\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 08:18:22.733681 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.733652 2579 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1287ce36-a083-45ae-ba05-a238566f322c-must-gather-output\") on node \"ip-10-0-142-45.ec2.internal\" DevicePath \"\"" Apr 17 08:18:22.884209 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:22.884178 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1287ce36-a083-45ae-ba05-a238566f322c" path="/var/lib/kubelet/pods/1287ce36-a083-45ae-ba05-a238566f322c/volumes" Apr 17 08:18:23.626099 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:23.626062 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-plmjw_81317276-c9cb-47e7-a1e6-ff72bed6bfc7/cluster-monitoring-operator/0.log" Apr 17 08:18:23.931874 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:23.931844 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hlkqr_889bd38b-dd72-4df0-bede-0c93109664ba/node-exporter/0.log" Apr 17 08:18:23.952766 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:23.952737 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hlkqr_889bd38b-dd72-4df0-bede-0c93109664ba/kube-rbac-proxy/0.log" Apr 17 08:18:23.978909 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:23.978857 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hlkqr_889bd38b-dd72-4df0-bede-0c93109664ba/init-textfile/0.log" Apr 17 08:18:24.007394 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.007354 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zgmwj_827d2ca3-e0b2-47a8-b2e2-124564260c25/kube-rbac-proxy-main/0.log" Apr 17 08:18:24.033020 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.032990 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zgmwj_827d2ca3-e0b2-47a8-b2e2-124564260c25/kube-rbac-proxy-self/0.log" Apr 17 08:18:24.057409 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.057343 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zgmwj_827d2ca3-e0b2-47a8-b2e2-124564260c25/openshift-state-metrics/0.log" Apr 17 08:18:24.096102 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.096062 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_681625ef-d4f2-4499-83c8-2234ec8c6c18/prometheus/0.log" Apr 17 08:18:24.113603 ip-10-0-142-45 
kubenswrapper[2579]: I0417 08:18:24.113573 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_681625ef-d4f2-4499-83c8-2234ec8c6c18/config-reloader/0.log" Apr 17 08:18:24.134101 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.134073 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_681625ef-d4f2-4499-83c8-2234ec8c6c18/thanos-sidecar/0.log" Apr 17 08:18:24.159465 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.159433 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_681625ef-d4f2-4499-83c8-2234ec8c6c18/kube-rbac-proxy-web/0.log" Apr 17 08:18:24.181202 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.181141 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_681625ef-d4f2-4499-83c8-2234ec8c6c18/kube-rbac-proxy/0.log" Apr 17 08:18:24.204300 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.204214 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_681625ef-d4f2-4499-83c8-2234ec8c6c18/kube-rbac-proxy-thanos/0.log" Apr 17 08:18:24.233037 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.233013 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_681625ef-d4f2-4499-83c8-2234ec8c6c18/init-config-reloader/0.log" Apr 17 08:18:24.274119 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.274076 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-nhk85_4e851222-e25b-4bd4-a0e0-f16d686a1ad2/prometheus-operator/0.log" Apr 17 08:18:24.311942 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.311911 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-nhk85_4e851222-e25b-4bd4-a0e0-f16d686a1ad2/kube-rbac-proxy/0.log" Apr 17 08:18:24.387710 
ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.387680 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-579c77c998-2zhv2_3383c607-6fe7-4824-a6bf-297348c8d409/telemeter-client/0.log" Apr 17 08:18:24.429296 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.429278 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-579c77c998-2zhv2_3383c607-6fe7-4824-a6bf-297348c8d409/reload/0.log" Apr 17 08:18:24.474146 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.474031 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-579c77c998-2zhv2_3383c607-6fe7-4824-a6bf-297348c8d409/kube-rbac-proxy/0.log" Apr 17 08:18:24.521181 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.521141 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754887df57-tjggd_80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4/thanos-query/0.log" Apr 17 08:18:24.566627 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.566570 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754887df57-tjggd_80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4/kube-rbac-proxy-web/0.log" Apr 17 08:18:24.610645 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.610606 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754887df57-tjggd_80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4/kube-rbac-proxy/0.log" Apr 17 08:18:24.657298 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.657264 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754887df57-tjggd_80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4/prom-label-proxy/0.log" Apr 17 08:18:24.697980 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.697943 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-754887df57-tjggd_80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4/kube-rbac-proxy-rules/0.log" Apr 17 08:18:24.745864 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:24.745774 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754887df57-tjggd_80c29f9b-89fa-41d1-b4a2-4f1dfcf38af4/kube-rbac-proxy-metrics/0.log" Apr 17 08:18:26.339656 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:26.339624 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/2.log" Apr 17 08:18:26.344141 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:26.344120 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7wts4_e3b4cab1-15d3-4640-89c6-ec91e734f2fd/console-operator/3.log" Apr 17 08:18:26.731367 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:26.731337 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-6d49j_7cd30c29-08d2-4de4-9362-0464cd619d23/download-server/0.log" Apr 17 08:18:27.060621 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.060548 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz"] Apr 17 08:18:27.060877 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.060866 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1287ce36-a083-45ae-ba05-a238566f322c" containerName="gather" Apr 17 08:18:27.060947 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.060906 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1287ce36-a083-45ae-ba05-a238566f322c" containerName="gather" Apr 17 08:18:27.060947 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.060929 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1287ce36-a083-45ae-ba05-a238566f322c" containerName="copy" Apr 17 08:18:27.060947 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.060936 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1287ce36-a083-45ae-ba05-a238566f322c" containerName="copy" Apr 17 08:18:27.061066 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.061016 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1287ce36-a083-45ae-ba05-a238566f322c" containerName="copy" Apr 17 08:18:27.061066 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.061027 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1287ce36-a083-45ae-ba05-a238566f322c" containerName="gather" Apr 17 08:18:27.065583 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.065561 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.072808 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.072759 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz"] Apr 17 08:18:27.131076 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.131048 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-8fnbq_a893b496-3c99-4d44-969c-deb90700402f/volume-data-source-validator/0.log" Apr 17 08:18:27.176152 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.176116 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpnlh\" (UniqueName: \"kubernetes.io/projected/c1ac3a63-b8cc-47c6-a001-bf01022f7910-kube-api-access-gpnlh\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.176446 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.176422 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-podres\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.176604 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.176588 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-proc\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.176773 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.176748 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-sys\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.176996 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.176977 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-lib-modules\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.278322 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.278279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-podres\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " 
pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.278504 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.278356 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-podres\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.278504 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.278367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-proc\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.278504 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.278424 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-proc\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.278504 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.278432 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-sys\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.278504 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.278483 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-sys\") pod \"perf-node-gather-daemonset-s28qz\" (UID: 
\"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.278750 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.278505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-lib-modules\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.278750 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.278566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpnlh\" (UniqueName: \"kubernetes.io/projected/c1ac3a63-b8cc-47c6-a001-bf01022f7910-kube-api-access-gpnlh\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.278750 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.278630 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1ac3a63-b8cc-47c6-a001-bf01022f7910-lib-modules\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.286342 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.286308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpnlh\" (UniqueName: \"kubernetes.io/projected/c1ac3a63-b8cc-47c6-a001-bf01022f7910-kube-api-access-gpnlh\") pod \"perf-node-gather-daemonset-s28qz\" (UID: \"c1ac3a63-b8cc-47c6-a001-bf01022f7910\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.379675 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.379624 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.528737 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.528611 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz"] Apr 17 08:18:27.532192 ip-10-0-142-45 kubenswrapper[2579]: W0417 08:18:27.532161 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc1ac3a63_b8cc_47c6_a001_bf01022f7910.slice/crio-41d6fd81ea5d5846ee09635e28cbe36fba69c73a1db528202b7af45caef075b4 WatchSource:0}: Error finding container 41d6fd81ea5d5846ee09635e28cbe36fba69c73a1db528202b7af45caef075b4: Status 404 returned error can't find the container with id 41d6fd81ea5d5846ee09635e28cbe36fba69c73a1db528202b7af45caef075b4 Apr 17 08:18:27.657646 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.657552 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" event={"ID":"c1ac3a63-b8cc-47c6-a001-bf01022f7910","Type":"ContainerStarted","Data":"a561a1888503771054519b091814ac0970bd52b0935332860ffb7de2047501ac"} Apr 17 08:18:27.657646 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.657602 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" event={"ID":"c1ac3a63-b8cc-47c6-a001-bf01022f7910","Type":"ContainerStarted","Data":"41d6fd81ea5d5846ee09635e28cbe36fba69c73a1db528202b7af45caef075b4"} Apr 17 08:18:27.657646 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.657638 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:27.671661 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.671585 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" 
podStartSLOduration=0.671566513 podStartE2EDuration="671.566513ms" podCreationTimestamp="2026-04-17 08:18:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:18:27.671305308 +0000 UTC m=+1601.380908824" watchObservedRunningTime="2026-04-17 08:18:27.671566513 +0000 UTC m=+1601.381170027" Apr 17 08:18:27.779952 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.779906 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lrw8v_61a4efa4-9ac5-47c3-ba8a-6fa191936c56/dns/0.log" Apr 17 08:18:27.800311 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.800215 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lrw8v_61a4efa4-9ac5-47c3-ba8a-6fa191936c56/kube-rbac-proxy/0.log" Apr 17 08:18:27.909103 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:27.909022 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-62tj8_e5343931-acc9-4a96-81bd-fc6bbad4d9be/dns-node-resolver/0.log" Apr 17 08:18:28.443359 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:28.443307 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mvk82_c3da12b2-c527-43e5-96c7-37b56fb6b22d/node-ca/0.log" Apr 17 08:18:29.446357 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:29.446327 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bh4vm_d4301eb7-7ba7-42b5-961e-ef10f1fe7955/serve-healthcheck-canary/0.log" Apr 17 08:18:29.982531 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:29.982502 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z8d29_7cacbfd9-e67b-4bff-ad81-e39e7850f0b0/kube-rbac-proxy/0.log" Apr 17 08:18:30.002173 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:30.002115 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-z8d29_7cacbfd9-e67b-4bff-ad81-e39e7850f0b0/exporter/0.log" Apr 17 08:18:30.022688 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:30.022663 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z8d29_7cacbfd9-e67b-4bff-ad81-e39e7850f0b0/extractor/0.log" Apr 17 08:18:32.133267 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:32.133237 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-czjmb_1e5521e4-05bf-4dda-94fb-09c1400e8ffd/s3-init/0.log" Apr 17 08:18:33.671940 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:33.671910 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-s28qz" Apr 17 08:18:35.785619 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:35.785549 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kbg5v_b7e592dd-647e-4a90-b403-e187b9b2475f/migrator/0.log" Apr 17 08:18:35.808338 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:35.808312 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kbg5v_b7e592dd-647e-4a90-b403-e187b9b2475f/graceful-termination/0.log" Apr 17 08:18:37.449935 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:37.449878 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wg6r2_83dbb733-5a1c-4565-b720-5b5bf99f74b8/kube-multus-additional-cni-plugins/0.log" Apr 17 08:18:37.472044 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:37.472016 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wg6r2_83dbb733-5a1c-4565-b720-5b5bf99f74b8/egress-router-binary-copy/0.log" Apr 17 08:18:37.493756 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:37.493730 2579 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wg6r2_83dbb733-5a1c-4565-b720-5b5bf99f74b8/cni-plugins/0.log" Apr 17 08:18:37.513740 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:37.513711 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wg6r2_83dbb733-5a1c-4565-b720-5b5bf99f74b8/bond-cni-plugin/0.log" Apr 17 08:18:37.533693 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:37.533664 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wg6r2_83dbb733-5a1c-4565-b720-5b5bf99f74b8/routeoverride-cni/0.log" Apr 17 08:18:37.554476 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:37.554451 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wg6r2_83dbb733-5a1c-4565-b720-5b5bf99f74b8/whereabouts-cni-bincopy/0.log" Apr 17 08:18:37.576170 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:37.576145 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wg6r2_83dbb733-5a1c-4565-b720-5b5bf99f74b8/whereabouts-cni/0.log" Apr 17 08:18:37.638458 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:37.638430 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dds6t_4436d074-134b-4347-bddd-5a274dc24549/kube-multus/0.log" Apr 17 08:18:37.787154 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:37.787063 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qq9zp_2b638776-5000-47ee-92c9-8ba7655f560c/network-metrics-daemon/0.log" Apr 17 08:18:37.811198 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:37.811158 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qq9zp_2b638776-5000-47ee-92c9-8ba7655f560c/kube-rbac-proxy/0.log" Apr 17 08:18:38.591233 ip-10-0-142-45 kubenswrapper[2579]: 
I0417 08:18:38.591152 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-controller/0.log" Apr 17 08:18:38.609847 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:38.609816 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/0.log" Apr 17 08:18:38.617812 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:38.617782 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovn-acl-logging/1.log" Apr 17 08:18:38.638623 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:38.638602 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/kube-rbac-proxy-node/0.log" Apr 17 08:18:38.663835 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:38.663804 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 08:18:38.680267 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:38.680248 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/northd/0.log" Apr 17 08:18:38.700477 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:38.700455 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/nbdb/0.log" Apr 17 08:18:38.721264 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:38.721239 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/sbdb/0.log" Apr 17 08:18:38.824528 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:38.824480 
2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ldxs6_d966014f-f7ed-4082-afdb-6e81d3b82816/ovnkube-controller/0.log" Apr 17 08:18:40.345453 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:40.345423 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4d2hp_93dafc8e-891d-4f40-b29c-0e82ac63515a/check-endpoints/0.log" Apr 17 08:18:40.366937 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:40.366909 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ml85w_45bb327f-877b-4da6-8c2d-e760eb62707d/network-check-target-container/0.log" Apr 17 08:18:41.336470 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:41.336442 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7m6g5_2340e8a7-5618-413e-a2cf-f5ff4f985ad1/iptables-alerter/0.log" Apr 17 08:18:41.940662 ip-10-0-142-45 kubenswrapper[2579]: I0417 08:18:41.940635 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-fnbbr_9f36b281-ef72-4eb5-963a-ef189f7f1559/tuned/0.log"