Apr 16 18:13:54.524567 ip-10-0-128-68 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:13:54.524579 ip-10-0-128-68 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:13:54.524589 ip-10-0-128-68 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:13:54.524917 ip-10-0-128-68 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:14:04.720793 ip-10-0-128-68 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:14:04.720813 ip-10-0-128-68 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 216a6d6f97db4c8f9ba0d5b317fdf487 --
Apr 16 18:16:28.791500 ip-10-0-128-68 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:16:29.263568 ip-10-0-128-68 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:16:29.263568 ip-10-0-128-68 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:16:29.263568 ip-10-0-128-68 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:16:29.263568 ip-10-0-128-68 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:16:29.263568 ip-10-0-128-68 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:16:29.265351 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.265205 2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:16:29.267636 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267620 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:16:29.267636 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267637 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267640 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267643 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267646 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267649 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267652 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267655 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267658 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267660 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267663 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267668 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267672 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267675 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267678 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267681 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267684 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267687 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267690 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267693 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:16:29.267704 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267695 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267698 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267700 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267703 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267706 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267712 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267715 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267718 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267720 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267723 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267726 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267729 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267731 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267733 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267736 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267739 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267741 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267744 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267746 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267749 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:16:29.268147 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267752 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267754 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267757 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267760 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267762 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267765 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267775 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267778 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267780 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267783 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267787 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267791 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267794 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267798 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267802 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267805 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267808 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267811 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267813 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:16:29.268652 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267816 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267818 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267821 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267825 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267827 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267830 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267832 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267835 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267837 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267840 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267842 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267845 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267847 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267850 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267852 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267855 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267858 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267862 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267865 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267867 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:16:29.269116 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267870 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267872 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267875 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267877 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267880 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267882 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.267885 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268302 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268307 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268310 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268313 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268316 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268318 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268321 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268324 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268326 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268329 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268332 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268334 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268337 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:16:29.269744 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268339 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268342 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268344 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268347 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268349 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268352 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268354 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268357 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268360 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268363 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268366 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268369 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268371 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268374 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268376 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268379 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268381 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268384 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268386 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268390 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268393 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:16:29.270252 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268395 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268398 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268401 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268403 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268406 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268408 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268411 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268413 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268416 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268418 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268421 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268423 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268426 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268429 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268431 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268434 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268436 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268439 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268442 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268445 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:16:29.270768 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268448 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268450 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268453 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268455 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268458 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268462 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268466 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268469 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268472 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268474 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268478 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268481 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268484 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268486 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268488 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268491 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268494 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268497 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:16:29.271275 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268499 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268502 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268506 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268509 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268511 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268514 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268517 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268519 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268522 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268525 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268528 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268530 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268533 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.268535 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268611 2562 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268624 2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268630 2562 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268635 2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268640 2562 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268643 2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268648 2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:16:29.271712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268652 2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268657 2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268660 2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268664 2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268668 2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268671 2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268674 2562 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268677 2562 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268680 2562 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268683 2562 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268686 2562 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268689 2562 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268694 2562 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268697 2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268700 2562 flags.go:64] FLAG: --config-dir=""
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268703 2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268707 2562 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268711 2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268713 2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268717 2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268720 2562 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268723 2562 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268726 2562 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268729 2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268732 2562 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:16:29.272240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268736 2562 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268740 2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268743 2562 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268746 2562 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268753 2562 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268757 2562 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.268760 2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270176 2562 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270181 2562 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270184 2562 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270200 2562 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270204 2562 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270208 2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270211 2562 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270214 2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270217 2562 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270220 2562 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270223 2562 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270226 2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270229 2562 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270233 2562 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270236 2562 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270238 2562 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270243 2562 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270246 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:16:29.272842 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270249 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270252 2562 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270256 2562 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270259 2562 flags.go:64] FLAG: --help="false"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270262 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-128-68.ec2.internal"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270266 2562 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270269 2562 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270272 2562 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270276 2562 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270279 2562 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270283 2562 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270287 2562 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270290 2562 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270293 2562 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270296 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270300 2562 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270302 2562 flags.go:64] FLAG: --kube-reserved=""
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270305 2562 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270308 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270311 2562 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270314 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270317 2562 flags.go:64] FLAG: --lock-file=""
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270320 2562 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270323 2562 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 18:16:29.273490 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270326 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270331 2562 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270334 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270337 2562 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270340 2562 flags.go:64] FLAG: --logging-format="text"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270343 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270346 2562 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270349 2562 flags.go:64] FLAG: --manifest-url=""
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270352 2562 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270357 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270361 2562 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270365 2562 flags.go:64] FLAG: --max-pods="110"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270368 2562 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270371 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270374 2562 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270377 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270380 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270383 2562 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270386 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270396 2562 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270399 2562 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270402 2562 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270405 2562 flags.go:64] FLAG: --pod-cidr=""
Apr 16 18:16:29.274060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270408 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270414 2562 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270417 2562 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270420 2562 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270423 2562 flags.go:64] FLAG: --port="10250"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270427 2562 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270430 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0457b1a45f569c39e"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270433 2562 flags.go:64] FLAG: --qos-reserved=""
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270436 2562 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270439 2562 flags.go:64] FLAG: --register-node="true"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270441 2562 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270445 2562 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270448 2562 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270451 2562 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270454 2562 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270457 2562 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270461 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270464 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270467 2562 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270470 2562 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270473 2562 flags.go:64] FLAG: --runonce="false"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270476 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270479 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270482 2562 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270485 2562 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270488 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:16:29.274695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270492 2562 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270495 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270498 2562 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270504 2562 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270508 2562 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270511 2562 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270514 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270517 2562 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270520 2562 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270523 2562 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270528 2562 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270531 2562 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270534 2562 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270541 2562 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270544 2562 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270547 2562 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270550 2562 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270553 2562 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270557 2562 flags.go:64] FLAG: --v="2"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270561 2562 flags.go:64] FLAG: --version="false"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270565 2562 flags.go:64] FLAG: --vmodule=""
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270569 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270573 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270694 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:16:29.275340 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270698 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270701 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270704 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270707 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270710 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270712 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270715 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270718 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270720 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270723 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270726 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270728 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270732 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270734 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270737 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270739 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270742 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270744 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270747 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270750 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:16:29.275916 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270752 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270755 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270758 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270760 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270764 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270768 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270771 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270774 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270777 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270788 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270791 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270794 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270796 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270799 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270801 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270804 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270806 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270809 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270811 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270814 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:16:29.276458 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270816 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270819 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270821 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270826 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270828 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270832 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270835 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270838 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270840 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270843 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270845 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270848 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270851 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270854 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270856 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270859 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270861 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270864 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270867 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270869 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:16:29.277321 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270872 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270875 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270878 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270880 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270883 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270886 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270888 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270891 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270893 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270896 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270898 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270901 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270903 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270906 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270908 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270912 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270915 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270920 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270924 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:16:29.278174 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270927 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270930 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270933 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270936 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270939 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.270942 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.270947 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.278501 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.278522 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278598 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278608 2562 feature_gate.go:351] Setting
GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278616 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278621 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278626 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278631 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:16:29.279028 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278636 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278640 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278644 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278649 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278653 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278658 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278662 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278667 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278671 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278675 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278679 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278683 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278688 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278693 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278697 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278702 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278706 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278710 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 
18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278715 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278719 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:16:29.279758 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278723 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278727 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278731 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278736 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278740 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278752 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278757 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278761 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278765 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278770 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278774 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278778 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278793 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278798 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278803 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278806 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278810 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278815 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278819 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278823 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:16:29.280299 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278827 2562 feature_gate.go:328] unrecognized 
feature gate: ExternalSnapshotMetadata Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278832 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278836 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278840 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278844 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278849 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278853 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278857 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278864 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278870 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278875 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278880 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278885 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278889 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278893 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278897 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278902 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278906 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278912 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:16:29.280876 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278916 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278921 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278925 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278929 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:16:29.281582 
ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278933 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278937 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278941 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278945 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278949 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278953 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278958 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278962 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278966 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278970 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278974 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278978 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278982 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278986 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278991 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.278995 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:16:29.281582 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279000 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.279008 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279173 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279183 2562 feature_gate.go:328] unrecognized feature 
gate: SigstoreImageVerification Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279205 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279211 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279215 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279219 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279223 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279227 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279232 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279236 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279241 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279245 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279249 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:16:29.282435 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279254 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279260 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
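The repeated "unrecognized feature gate" warnings in this stretch are benign. The names match OpenShift cluster-level feature gates (ManagedBootImages, GatewayAPIController, SigstoreImageVerification, and so on), which the machine-config stack appears to write into the kubelet configuration wholesale; the kubelet's own gate registry only knows the Kubernetes gates, so each unknown key is logged at warning level and skipped, and startup continues. The same applies to the "Setting GA feature gate ServiceAccountTokenNodeBinding=true" and "Setting deprecated feature gate KMSv1=true" lines: they flag explicit settings for gates that have already graduated or been deprecated, not errors. A minimal Go sketch of this warn-and-continue behavior follows; it is illustrative only, not the kubelet's actual featuregate package (which lives in k8s.io/component-base/featuregate):

    // Apply a requested feature-gate map against a registry of known gates,
    // warning on unknown names instead of failing, as the log above shows.
    package main

    import "log"

    // known lists the gates this binary understands, with default values
    // (names borrowed from the map logged at feature_gate.go:384).
    var known = map[string]bool{
    	"NodeSwap":              false,
    	"ProcMountType":         false,
    	"UserNamespacesSupport": false,
    }

    func applyGates(requested map[string]bool) {
    	for name, enabled := range requested {
    		if _, ok := known[name]; !ok {
    			log.Printf("W] unrecognized feature gate: %s", name)
    			continue // warn and move on; startup is not aborted
    		}
    		known[name] = enabled
    	}
    }

    func main() {
    	applyGates(map[string]bool{
    		"UserNamespacesSupport": true, // known to the kubelet: applied
    		"GatewayAPIController":  true, // cluster-level gate: warned, ignored
    	})
    	log.Printf("feature gates: %v", known)
    }

All three enumeration passes end in the same "feature gates: {map[...]}" summary, so the repeated warning blocks carry no information beyond the first.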
Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279267 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279272 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279275 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279280 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279284 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279288 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279293 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279297 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279301 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279305 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279309 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279313 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279318 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279322 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279326 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279330 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279334 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279339 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:16:29.282909 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279343 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279348 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279353 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279357 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279361 2562 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279365 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279369 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279375 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279379 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279383 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279388 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279392 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279397 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279401 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279406 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279411 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279415 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279419 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279423 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279428 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:16:29.283551 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279432 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279436 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279440 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279445 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279449 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279453 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279457 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:16:29.284073 ip-10-0-128-68 
kubenswrapper[2562]: W0416 18:16:29.279462 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279466 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279470 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279474 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279508 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279516 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279520 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279525 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279530 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279534 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279538 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279543 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:16:29.284073 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279547 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279551 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279555 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279559 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279564 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279569 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279573 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279577 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279581 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279585 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279592 2562 
feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279597 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279601 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:29.279605 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.279613 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:16:29.284612 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.280515 2562 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:16:29.285039 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.283171 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:16:29.285039 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.284364 2562 server.go:1019] "Starting client certificate rotation"
Apr 16 18:16:29.285039 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.284462 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:16:29.285911 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.285897 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:16:29.310438 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.310409 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:16:29.314812 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.314787 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:16:29.335801 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.335772 2562 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:16:29.341230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.341210 2562 log.go:25] "Validated CRI v1 image API"
Apr 16 18:16:29.342580 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.342564 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:16:29.342942 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.342925 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:16:29.347251 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.347229 2562 fs.go:135] Filesystem UUIDs: map[4d523c33-1138-43ae-a17f-8e46dfd022b1:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8bd5de7a-0d60-4070-b0a7-af3820fbb7d9:/dev/nvme0n1p3]
Apr 16 18:16:29.347316 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.347251 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:16:29.353330 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.353210 2562 manager.go:217] Machine: {Timestamp:2026-04-16 18:16:29.351380333 +0000 UTC m=+0.436829715 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100093 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20192727939f9ad197c2d14a89b6a1 SystemUUID:ec201927-2793-9f9a-d197-c2d14a89b6a1 BootID:216a6d6f-97db-4c8f-9ba0-d5b317fdf487 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2d:f8:49:c7:11 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2d:f8:49:c7:11 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d2:95:3e:ea:df:c3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:16:29.354149 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.354136 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
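Two numbers in the entries around this point fix the node's memory budget: the cAdvisor Machine entry above reports MemoryCapacity:33164488704 (about 30.9 GiB), and the Container Manager entry just below reserves "memory":"1Gi" in SystemReserved and sets a memory.available hard-eviction threshold of 100Mi, with KubeReserved null. Under the standard kubelet node-allocatable formula (allocatable = capacity - kube-reserved - system-reserved - hard-eviction threshold), that yields the arithmetic sketched here; the formula is the documented kubelet one, and the constants are taken only from these log entries:

    // Worked example of node-allocatable memory for this node.
    package main

    import "fmt"

    func main() {
    	const (
    		capacity       int64 = 33164488704 // MemoryCapacity from the Machine entry
    		systemReserved int64 = 1 << 30     // SystemReserved "memory":"1Gi"
    		hardEviction   int64 = 100 << 20   // memory.available threshold "100Mi"
    	)
    	allocatable := capacity - systemReserved - hardEviction // KubeReserved is null
    	fmt.Printf("allocatable memory: %d bytes (~%.2f GiB)\n",
    		allocatable, float64(allocatable)/(1<<30))
    	// Output: allocatable memory: 31985889280 bytes (~29.79 GiB)
    }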
Apr 16 18:16:29.354262 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.354249 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:16:29.355277 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.355245 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:16:29.355428 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.355280 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-68.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:16:29.355475 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.355438 2562 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:16:29.355475 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.355448 2562 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:16:29.355475 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.355461 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:16:29.355475 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.355475 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:16:29.357278 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.357265 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:16:29.357393 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.357384 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:16:29.359871 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.359860 2562 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:16:29.359922 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.359876 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:16:29.359922 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.359889 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:16:29.359922 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.359900 2562 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:16:29.359922 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.359909 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:16:29.361028 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.361010 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:16:29.361115 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.361037 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:16:29.365881 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.365863 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:16:29.367794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.367779 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:16:29.368346 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.368331 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pr7jc"
Apr 16 18:16:29.369149 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369137 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:16:29.369213 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369155 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:16:29.369213 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369162 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:16:29.369213 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369169 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:16:29.369213 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369176 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:16:29.369213 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369197 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:16:29.369213 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369207 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:16:29.369213 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369212 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:16:29.369398 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369219 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:16:29.369398 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369226 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:16:29.369398 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369242 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:16:29.369398 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.369252 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:16:29.370157 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.370147 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:16:29.370202 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.370159 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:16:29.372865 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.372831 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:16:29.373000 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.372980 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-68.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:16:29.373073 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.373056 2562 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-68.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:16:29.374733 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.374717 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:16:29.374815 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.374806 2562 server.go:1295] "Started kubelet"
Apr 16 18:16:29.374950 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.374898 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:16:29.374950 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.374908 2562 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:16:29.375055 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.374968 2562 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:16:29.375659 ip-10-0-128-68 systemd[1]: Started Kubernetes Kubelet.
Apr 16 18:16:29.376779 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.376762 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pr7jc"
Apr 16 18:16:29.377657 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.377638 2562 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:16:29.378667 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.378640 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:16:29.383814 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.383792 2562 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:16:29.384709 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.384694 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:16:29.385244 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.385223 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:16:29.386009 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.385987 2562 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:16:29.386009 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386009 2562 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:16:29.386133 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386099 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:16:29.386178 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386153 2562 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:16:29.386178 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386162 2562 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:16:29.386289 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.386217 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-68.ec2.internal\" not found"
Apr 16 18:16:29.386393 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386376 2562 factory.go:55] Registering systemd factory
Apr 16 18:16:29.386449 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386395 2562 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:16:29.386616 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386603 2562 factory.go:153] Registering CRI-O factory
Apr 16 18:16:29.386616 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386618 2562 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:16:29.386724 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386668 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:16:29.386724 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386686 2562 factory.go:103] Registering Raw factory
Apr 16 18:16:29.386724 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.386697 2562 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:16:29.387089 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.387075 2562 manager.go:319] Starting recovery of all containers
Apr 16 18:16:29.388319 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.388303 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:16:29.391057 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.391029 2562 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-68.ec2.internal\" not found" node="ip-10-0-128-68.ec2.internal"
Apr 16 18:16:29.396780 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.396737 2562 manager.go:324] Recovery completed
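The "system:anonymous ... forbidden" errors above are a bootstrap-ordering artifact, not an RBAC problem: the informers start before the bootstrap CSR is issued, so the first list calls go out without a usable client certificate and the API server treats them as anonymous. A few milliseconds later "Certificate signing request is issued" (csr-pr7jc) appears, and the reflectors retry and succeed, as the subsequent "Caches populated" lines show. The recovery relies on retry with backoff; a generic stdlib sketch of that pattern follows (illustrative, not client-go's actual backoff management):

    // Retry a failing list call with exponential backoff until it succeeds.
    package main

    import (
    	"errors"
    	"fmt"
    	"time"
    )

    // listNodes stands in for an informer's initial list; the first attempts
    // fail the way they do before the bootstrap certificate is issued.
    func listNodes(attempt int) error {
    	if attempt < 3 {
    		return errors.New(`nodes is forbidden: User "system:anonymous" cannot list resource "nodes"`)
    	}
    	return nil
    }

    func main() {
    	backoff := 100 * time.Millisecond
    	for attempt := 1; ; attempt++ {
    		if err := listNodes(attempt); err != nil {
    			fmt.Printf("Failed to watch: %v (retrying in %v)\n", err, backoff)
    			time.Sleep(backoff)
    			backoff *= 2 // double the wait between attempts
    			continue
    		}
    		fmt.Println("Caches populated")
    		return
    	}
    }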
Apr 16 18:16:29.398995 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.398968 2562 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 18:16:29.402018 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.402006 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:16:29.404439 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.404313 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:16:29.404500 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.404453 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:16:29.404500 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.404463 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:16:29.404955 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.404942 2562 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:16:29.404955 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.404954 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:16:29.405021 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.404971 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:16:29.406960 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.406949 2562 policy_none.go:49] "None policy: Start"
Apr 16 18:16:29.407003 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.406964 2562 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:16:29.407003 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.406974 2562 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:16:29.442897 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.442879 2562 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:16:29.457290 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.442920 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:16:29.457290 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.442933 2562 server.go:85] "Starting device plugin registration server"
Apr 16 18:16:29.457290 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.443255 2562 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:16:29.457290 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.443267 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:16:29.457290 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.443347 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:16:29.457290 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.443440 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:16:29.457290 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.443449 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:16:29.457290 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.444024 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:16:29.457290 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.444067 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-68.ec2.internal\" not found"
Apr 16 18:16:29.517634 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.517558 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:16:29.518713 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.518694 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:16:29.518796 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.518733 2562 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:16:29.518796 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.518763 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 18:16:29.518796 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.518774 2562 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:16:29.518943 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.518826 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:16:29.521465 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.521445 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:16:29.543385 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.543352 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:16:29.544356 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.544337 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:16:29.544443 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.544369 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:16:29.544443 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.544384 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:16:29.544443 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.544411 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-68.ec2.internal"
Apr 16 18:16:29.550994 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.550975 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-68.ec2.internal"
Apr 16 18:16:29.551042 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.551000 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-68.ec2.internal\": node \"ip-10-0-128-68.ec2.internal\" not found"
Apr 16 18:16:29.572052 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.572027 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-68.ec2.internal\" not found"
Apr 16 18:16:29.619488 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.619462 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal"]
"Setting node annotation to enable volume controller attach/detach" Apr 16 18:16:29.621131 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.621114 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:16:29.621229 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.621145 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:16:29.621229 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.621154 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:16:29.622308 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.622296 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:16:29.622444 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.622430 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.622477 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.622459 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:16:29.623133 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.623119 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:16:29.623218 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.623143 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:16:29.623218 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.623153 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:16:29.623218 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.623125 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:16:29.623218 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.623207 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:16:29.623396 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.623224 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:16:29.624878 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.624862 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.624935 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.624897 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:16:29.629416 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.629401 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:16:29.629493 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.629427 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:16:29.629493 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.629439 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:16:29.637745 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.637727 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-68.ec2.internal\" not found" node="ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.643749 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.643734 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-68.ec2.internal\" not found" node="ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.672755 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.672725 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-68.ec2.internal\" not found" Apr 16 18:16:29.687601 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.687577 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96464c8f8d874d5fa7f0601b1e26dfe1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal\" (UID: \"96464c8f8d874d5fa7f0601b1e26dfe1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.687703 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.687607 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69d4f0efd48f93f6cc380a4943e78ab2-config\") pod \"kube-apiserver-proxy-ip-10-0-128-68.ec2.internal\" (UID: \"69d4f0efd48f93f6cc380a4943e78ab2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.687703 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.687629 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/96464c8f8d874d5fa7f0601b1e26dfe1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal\" (UID: \"96464c8f8d874d5fa7f0601b1e26dfe1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.773031 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.772968 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-68.ec2.internal\" not found" Apr 16 18:16:29.788496 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.788455 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/96464c8f8d874d5fa7f0601b1e26dfe1-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal\" (UID: \"96464c8f8d874d5fa7f0601b1e26dfe1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.788616 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.788501 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96464c8f8d874d5fa7f0601b1e26dfe1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal\" (UID: \"96464c8f8d874d5fa7f0601b1e26dfe1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.788616 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.788527 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69d4f0efd48f93f6cc380a4943e78ab2-config\") pod \"kube-apiserver-proxy-ip-10-0-128-68.ec2.internal\" (UID: \"69d4f0efd48f93f6cc380a4943e78ab2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.788616 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.788555 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/96464c8f8d874d5fa7f0601b1e26dfe1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal\" (UID: \"96464c8f8d874d5fa7f0601b1e26dfe1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.788616 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.788557 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69d4f0efd48f93f6cc380a4943e78ab2-config\") pod \"kube-apiserver-proxy-ip-10-0-128-68.ec2.internal\" (UID: \"69d4f0efd48f93f6cc380a4943e78ab2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.788616 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.788558 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96464c8f8d874d5fa7f0601b1e26dfe1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal\" (UID: \"96464c8f8d874d5fa7f0601b1e26dfe1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.873915 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.873877 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-68.ec2.internal\" not found" Apr 16 18:16:29.941417 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.941385 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.946454 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:29.946432 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal" Apr 16 18:16:29.974644 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:29.974610 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-68.ec2.internal\" not found" Apr 16 18:16:30.075248 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:30.075153 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-68.ec2.internal\" not found" Apr 16 18:16:30.175624 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:30.175592 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-68.ec2.internal\" not found" Apr 16 18:16:30.221481 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.221452 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:16:30.284240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.284213 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:16:30.284691 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.284333 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:16:30.284691 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.284372 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:16:30.284691 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.284375 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:16:30.286364 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.286345 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" Apr 16 18:16:30.309670 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.309647 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:16:30.311273 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.311257 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal" Apr 16 18:16:30.323501 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.323479 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:16:30.360705 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.360628 2562 apiserver.go:52] "Watching apiserver" Apr 16 18:16:30.370648 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.370624 2562 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:16:30.372823 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.372693 2562 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8spcs","openshift-image-registry/node-ca-5m9b6","openshift-multus/multus-additional-cni-plugins-87852","openshift-multus/network-metrics-daemon-4vgjf","openshift-network-diagnostics/network-check-target-7r5l5","openshift-ovn-kubernetes/ovnkube-node-tttq4","kube-system/konnectivity-agent-pbnvm","kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal","openshift-multus/multus-rbhdn","openshift-network-operator/iptables-alerter-tl27p","openshift-cluster-node-tuning-operator/tuned-zww8n"] Apr 16 18:16:30.375571 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.375546 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pbnvm" Apr 16 18:16:30.375668 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.375631 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5m9b6" Apr 16 18:16:30.377636 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.377613 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.378177 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.378157 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:16:30.378177 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.378165 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:16:30.378362 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.378179 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wzzrc\"" Apr 16 18:16:30.378362 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.378179 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:16:30.378578 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.378565 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-w7mkd\"" Apr 16 18:16:30.378635 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.378582 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:16:30.378635 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.378584 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:30.378733 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.378688 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:16:30.378733 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:30.378693 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8" Apr 16 18:16:30.378827 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:30.378745 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58" Apr 16 18:16:30.378883 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.378870 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:16:30.379754 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.379738 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:16:30.379912 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.379863 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:16:30.379912 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.379873 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:16:30.380047 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.379950 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.380047 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.380002 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:16:30.380047 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.380036 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kgn4n\"" Apr 16 18:16:30.380212 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.380149 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:16:30.380292 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.380262 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:11:29 +0000 UTC" deadline="2027-10-03 10:59:13.785271334 +0000 UTC" Apr 16 18:16:30.380292 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.380289 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12832h42m43.404985053s" Apr 16 18:16:30.380939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.380926 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8spcs" Apr 16 18:16:30.381959 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.381943 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.382094 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.382046 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:16:30.382180 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.382143 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:16:30.382364 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.382348 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:16:30.382470 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.382454 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:16:30.382742 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.382728 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hvk9x\"" Apr 16 18:16:30.382820 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.382791 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:16:30.382820 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.382796 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:16:30.382917 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.382890 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:16:30.383057 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.383042 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2xs7s\"" Apr 16 18:16:30.383123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.383069 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.383123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.383086 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:16:30.384067 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.384054 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:16:30.384111 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.384090 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:16:30.384267 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.384252 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-tl27p" Apr 16 18:16:30.384751 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.384735 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zdnjn\"" Apr 16 18:16:30.384869 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.384781 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:16:30.384937 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.384910 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:16:30.385343 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.385329 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:16:30.385458 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.385427 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.385759 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.385745 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qfbmz\"" Apr 16 18:16:30.386344 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.386307 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hnjqb\"" Apr 16 18:16:30.386485 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.386473 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:16:30.386773 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.386755 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:16:30.386965 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.386932 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:16:30.388118 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.388096 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:16:30.388357 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.388341 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:16:30.388565 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.388551 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:16:30.388731 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.388574 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qw4f9\"" Apr 16 18:16:30.391928 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.391906 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l59tb\" (UniqueName: \"kubernetes.io/projected/d024a606-d155-4b9c-9936-eff2f2e2603c-kube-api-access-l59tb\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.392003 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.391944 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-sysctl-d\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.392003 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.391969 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b023a24-7ef2-470d-8cd5-90366b171323-ovn-node-metrics-cert\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.392097 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392008 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-system-cni-dir\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.392097 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392033 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-cnibin\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.392097 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392054 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-var-lib-cni-bin\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.392255 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392094 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-hostroot\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.392255 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392120 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-systemd\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.392255 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392135 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-var-lib-openvswitch\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.392255 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392160 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-socket-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.392255 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392180 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-tuned\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.392255 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392227 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-run-openvswitch\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.392255 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392249 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-run-ovn-kubernetes\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392266 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-var-lib-cni-multus\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392287 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d37887a7-2697-430c-834c-76614ddbb9b9-multus-daemon-config\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392306 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-sysctl-conf\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392322 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392340 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-cni-bin\") pod 
\"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392358 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-multus-conf-dir\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392380 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392394 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b023a24-7ef2-470d-8cd5-90366b171323-env-overrides\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392431 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-lib-modules\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392454 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f27d004-e5fc-4560-8699-ce203c2bf77e-cni-binary-copy\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392471 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-run-netns\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392487 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-var-lib-kubelet\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.392536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392524 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ct28\" (UniqueName: \"kubernetes.io/projected/0f27d004-e5fc-4560-8699-ce203c2bf77e-kube-api-access-5ct28\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" 
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392572 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e16d78d-6a61-4210-b4d5-ecf12d2038ca-host-slash\") pod \"iptables-alerter-tl27p\" (UID: \"2e16d78d-6a61-4210-b4d5-ecf12d2038ca\") " pod="openshift-network-operator/iptables-alerter-tl27p"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392591 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-multus-socket-dir-parent\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392605 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-system-cni-dir\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392620 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bczbz\" (UniqueName: \"kubernetes.io/projected/2e16d78d-6a61-4210-b4d5-ecf12d2038ca-kube-api-access-bczbz\") pod \"iptables-alerter-tl27p\" (UID: \"2e16d78d-6a61-4210-b4d5-ecf12d2038ca\") " pod="openshift-network-operator/iptables-alerter-tl27p"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392646 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sfrv\" (UniqueName: \"kubernetes.io/projected/d37887a7-2697-430c-834c-76614ddbb9b9-kube-api-access-5sfrv\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392678 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b023a24-7ef2-470d-8cd5-90366b171323-ovnkube-config\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392700 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392722 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-systemd-units\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392746 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-node-log\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392767 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392788 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e16d78d-6a61-4210-b4d5-ecf12d2038ca-iptables-alerter-script\") pod \"iptables-alerter-tl27p\" (UID: \"2e16d78d-6a61-4210-b4d5-ecf12d2038ca\") " pod="openshift-network-operator/iptables-alerter-tl27p"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392811 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b023a24-7ef2-470d-8cd5-90366b171323-ovnkube-script-lib\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392838 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-os-release\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392886 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z2ld\" (UniqueName: \"kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld\") pod \"network-check-target-7r5l5\" (UID: \"167f8a10-92f4-444e-912f-415dafc03e58\") " pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392916 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-sysconfig\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n"
Apr 16 18:16:30.392941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392939 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-sys\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392963 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-var-lib-kubelet\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.392981 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64568791-94bd-49ea-adf9-c39f5c4c8f08-tmp\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393000 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-registration-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393029 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-slash\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393047 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-run-systemd\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393062 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-run-ovn\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393095 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-cni-netd\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393113 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d37887a7-2697-430c-834c-76614ddbb9b9-cni-binary-copy\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393131 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-cnibin\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393168 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-kubernetes\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393212 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-host\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393234 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-device-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393252 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-sys-fs\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393267 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-os-release\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393307 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-run\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393344 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2gbb\" (UniqueName: \"kubernetes.io/projected/64568791-94bd-49ea-adf9-c39f5c4c8f08-kube-api-access-k2gbb\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n"
Apr 16 18:16:30.393560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393370 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ac43e44-6bc5-4678-9022-029aed19a8c1-hosts-file\") pod \"node-resolver-8spcs\" (UID: \"1ac43e44-6bc5-4678-9022-029aed19a8c1\") " pod="openshift-dns/node-resolver-8spcs"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393391 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-run-netns\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393414 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5f7e2ea0-df8c-485d-a95a-e622c53fab2d-konnectivity-ca\") pod \"konnectivity-agent-pbnvm\" (UID: \"5f7e2ea0-df8c-485d-a95a-e622c53fab2d\") " pod="kube-system/konnectivity-agent-pbnvm"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393436 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d4f7966d-78bf-4cbb-a764-b066fe69e484-host\") pod \"node-ca-5m9b6\" (UID: \"d4f7966d-78bf-4cbb-a764-b066fe69e484\") " pod="openshift-image-registry/node-ca-5m9b6"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393459 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-etc-openvswitch\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393482 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z84p8\" (UniqueName: \"kubernetes.io/projected/7b023a24-7ef2-470d-8cd5-90366b171323-kube-api-access-z84p8\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393507 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ac43e44-6bc5-4678-9022-029aed19a8c1-tmp-dir\") pod \"node-resolver-8spcs\" (UID: \"1ac43e44-6bc5-4678-9022-029aed19a8c1\") " pod="openshift-dns/node-resolver-8spcs"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393529 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-etc-selinux\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393556 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0f27d004-e5fc-4560-8699-ce203c2bf77e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393578 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5f7e2ea0-df8c-485d-a95a-e622c53fab2d-agent-certs\") pod \"konnectivity-agent-pbnvm\" (UID: \"5f7e2ea0-df8c-485d-a95a-e622c53fab2d\") " pod="kube-system/konnectivity-agent-pbnvm"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393600 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d4f7966d-78bf-4cbb-a764-b066fe69e484-serviceca\") pod \"node-ca-5m9b6\" (UID: \"d4f7966d-78bf-4cbb-a764-b066fe69e484\") " pod="openshift-image-registry/node-ca-5m9b6"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393623 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9tgx\" (UniqueName: \"kubernetes.io/projected/d4f7966d-78bf-4cbb-a764-b066fe69e484-kube-api-access-s9tgx\") pod \"node-ca-5m9b6\" (UID: \"d4f7966d-78bf-4cbb-a764-b066fe69e484\") " pod="openshift-image-registry/node-ca-5m9b6"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393646 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w76g5\" (UniqueName: \"kubernetes.io/projected/1ac43e44-6bc5-4678-9022-029aed19a8c1-kube-api-access-w76g5\") pod \"node-resolver-8spcs\" (UID: \"1ac43e44-6bc5-4678-9022-029aed19a8c1\") " pod="openshift-dns/node-resolver-8spcs"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393677 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88xdr\" (UniqueName: \"kubernetes.io/projected/7682219f-20c8-40ee-a84d-c68d79df1dd8-kube-api-access-88xdr\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393699 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-multus-cni-dir\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393714 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-run-k8s-cni-cncf-io\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn"
Apr 16 18:16:30.394217 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393727 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-run-multus-certs\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn"
Apr 16 18:16:30.395210 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393749 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0f27d004-e5fc-4560-8699-ce203c2bf77e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852"
Apr 16 18:16:30.395210 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393777 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-modprobe-d\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n"
Apr 16 18:16:30.395210 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393799 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-etc-kubernetes\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn"
Apr 16 18:16:30.395210 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393822 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-kubelet\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.395210 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.393844 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-log-socket\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.398077 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.398060 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:16:30.425010 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.424989 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xcbvh"
Apr 16 18:16:30.437023 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.437001 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xcbvh"
Apr 16 18:16:30.494046 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494026 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-slash\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494050 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-run-systemd\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494067 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-run-ovn\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494081 2562 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-cni-netd\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494102 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d37887a7-2697-430c-834c-76614ddbb9b9-cni-binary-copy\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494135 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-slash\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494143 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-cnibin\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494138 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-run-systemd\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494151 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-run-ovn\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494166 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-cni-netd\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494173 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-kubernetes\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.494230 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494232 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-kubernetes\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494231 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-host\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494291 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-cnibin\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494301 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-device-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494312 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-host\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494343 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-device-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494343 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-sys-fs\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494370 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-os-release\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494386 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-run\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494389 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-sys-fs\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494427 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-run\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494426 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2gbb\" (UniqueName: \"kubernetes.io/projected/64568791-94bd-49ea-adf9-c39f5c4c8f08-kube-api-access-k2gbb\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494451 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-os-release\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494463 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ac43e44-6bc5-4678-9022-029aed19a8c1-hosts-file\") pod \"node-resolver-8spcs\" (UID: \"1ac43e44-6bc5-4678-9022-029aed19a8c1\") " pod="openshift-dns/node-resolver-8spcs" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494488 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-run-netns\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494511 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5f7e2ea0-df8c-485d-a95a-e622c53fab2d-konnectivity-ca\") pod \"konnectivity-agent-pbnvm\" (UID: \"5f7e2ea0-df8c-485d-a95a-e622c53fab2d\") " pod="kube-system/konnectivity-agent-pbnvm" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494534 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d4f7966d-78bf-4cbb-a764-b066fe69e484-host\") pod \"node-ca-5m9b6\" (UID: \"d4f7966d-78bf-4cbb-a764-b066fe69e484\") " pod="openshift-image-registry/node-ca-5m9b6" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494557 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-etc-openvswitch\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.494695 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494561 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-run-netns\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494737 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ac43e44-6bc5-4678-9022-029aed19a8c1-hosts-file\") pod \"node-resolver-8spcs\" (UID: \"1ac43e44-6bc5-4678-9022-029aed19a8c1\") " pod="openshift-dns/node-resolver-8spcs" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494869 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-etc-openvswitch\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.494583 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z84p8\" (UniqueName: \"kubernetes.io/projected/7b023a24-7ef2-470d-8cd5-90366b171323-kube-api-access-z84p8\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495136 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ac43e44-6bc5-4678-9022-029aed19a8c1-tmp-dir\") pod \"node-resolver-8spcs\" (UID: \"1ac43e44-6bc5-4678-9022-029aed19a8c1\") " pod="openshift-dns/node-resolver-8spcs" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495173 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-etc-selinux\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495220 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0f27d004-e5fc-4560-8699-ce203c2bf77e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495268 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5f7e2ea0-df8c-485d-a95a-e622c53fab2d-agent-certs\") pod \"konnectivity-agent-pbnvm\" (UID: \"5f7e2ea0-df8c-485d-a95a-e622c53fab2d\") " pod="kube-system/konnectivity-agent-pbnvm" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495304 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d4f7966d-78bf-4cbb-a764-b066fe69e484-serviceca\") pod \"node-ca-5m9b6\" (UID: \"d4f7966d-78bf-4cbb-a764-b066fe69e484\") " pod="openshift-image-registry/node-ca-5m9b6" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495337 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9tgx\" (UniqueName: \"kubernetes.io/projected/d4f7966d-78bf-4cbb-a764-b066fe69e484-kube-api-access-s9tgx\") pod \"node-ca-5m9b6\" (UID: \"d4f7966d-78bf-4cbb-a764-b066fe69e484\") " pod="openshift-image-registry/node-ca-5m9b6" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495369 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w76g5\" (UniqueName: \"kubernetes.io/projected/1ac43e44-6bc5-4678-9022-029aed19a8c1-kube-api-access-w76g5\") pod \"node-resolver-8spcs\" (UID: \"1ac43e44-6bc5-4678-9022-029aed19a8c1\") " pod="openshift-dns/node-resolver-8spcs" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495404 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88xdr\" (UniqueName: \"kubernetes.io/projected/7682219f-20c8-40ee-a84d-c68d79df1dd8-kube-api-access-88xdr\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495430 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-multus-cni-dir\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495461 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-run-k8s-cni-cncf-io\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.495487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495491 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-run-multus-certs\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495525 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0f27d004-e5fc-4560-8699-ce203c2bf77e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495563 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-modprobe-d\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495585 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ac43e44-6bc5-4678-9022-029aed19a8c1-tmp-dir\") pod \"node-resolver-8spcs\" (UID: \"1ac43e44-6bc5-4678-9022-029aed19a8c1\") " pod="openshift-dns/node-resolver-8spcs" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495595 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-etc-kubernetes\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 
18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495627 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-kubelet\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495700 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-kubelet\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495707 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d4f7966d-78bf-4cbb-a764-b066fe69e484-host\") pod \"node-ca-5m9b6\" (UID: \"d4f7966d-78bf-4cbb-a764-b066fe69e484\") " pod="openshift-image-registry/node-ca-5m9b6" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495736 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-log-socket\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495798 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-run-multus-certs\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495829 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5f7e2ea0-df8c-485d-a95a-e622c53fab2d-konnectivity-ca\") pod \"konnectivity-agent-pbnvm\" (UID: \"5f7e2ea0-df8c-485d-a95a-e622c53fab2d\") " pod="kube-system/konnectivity-agent-pbnvm" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495892 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-multus-cni-dir\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495907 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-etc-selinux\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495996 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-modprobe-d\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.496123 ip-10-0-128-68 
kubenswrapper[2562]: I0416 18:16:30.496044 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-etc-kubernetes\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.496123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496113 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-run-k8s-cni-cncf-io\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496245 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496255 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d37887a7-2697-430c-834c-76614ddbb9b9-cni-binary-copy\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.495696 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-log-socket\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496432 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l59tb\" (UniqueName: \"kubernetes.io/projected/d024a606-d155-4b9c-9936-eff2f2e2603c-kube-api-access-l59tb\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496487 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-sysctl-d\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496520 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b023a24-7ef2-470d-8cd5-90366b171323-ovn-node-metrics-cert\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496554 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-system-cni-dir\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496586 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-cnibin\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496617 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0f27d004-e5fc-4560-8699-ce203c2bf77e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496622 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-var-lib-cni-bin\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496648 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-hostroot\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496678 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-systemd\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496709 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-var-lib-openvswitch\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496724 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d4f7966d-78bf-4cbb-a764-b066fe69e484-serviceca\") pod \"node-ca-5m9b6\" (UID: \"d4f7966d-78bf-4cbb-a764-b066fe69e484\") " pod="openshift-image-registry/node-ca-5m9b6" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496740 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-socket-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496773 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-tuned\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.496794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496804 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-run-openvswitch\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496837 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-run-ovn-kubernetes\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496870 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-var-lib-cni-multus\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496906 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d37887a7-2697-430c-834c-76614ddbb9b9-multus-daemon-config\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.496939 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-sysctl-conf\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497065 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497099 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-cni-bin\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497135 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-multus-conf-dir\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497169 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " 
pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497218 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b023a24-7ef2-470d-8cd5-90366b171323-env-overrides\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497221 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0f27d004-e5fc-4560-8699-ce203c2bf77e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497347 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-lib-modules\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497382 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f27d004-e5fc-4560-8699-ce203c2bf77e-cni-binary-copy\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497395 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-sysctl-conf\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497413 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-run-netns\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497446 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-var-lib-kubelet\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497457 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-run-openvswitch\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.497532 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497480 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ct28\" (UniqueName: 
\"kubernetes.io/projected/0f27d004-e5fc-4560-8699-ce203c2bf77e-kube-api-access-5ct28\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497508 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e16d78d-6a61-4210-b4d5-ecf12d2038ca-host-slash\") pod \"iptables-alerter-tl27p\" (UID: \"2e16d78d-6a61-4210-b4d5-ecf12d2038ca\") " pod="openshift-network-operator/iptables-alerter-tl27p" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497513 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-run-ovn-kubernetes\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497540 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-multus-socket-dir-parent\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497560 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-var-lib-cni-multus\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497576 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-system-cni-dir\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497734 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-system-cni-dir\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497793 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497848 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-cni-bin\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 
18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.497895 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-multus-conf-dir\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498002 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d37887a7-2697-430c-834c-76614ddbb9b9-multus-daemon-config\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498021 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498039 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-cnibin\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498048 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bczbz\" (UniqueName: \"kubernetes.io/projected/2e16d78d-6a61-4210-b4d5-ecf12d2038ca-kube-api-access-bczbz\") pod \"iptables-alerter-tl27p\" (UID: \"2e16d78d-6a61-4210-b4d5-ecf12d2038ca\") " pod="openshift-network-operator/iptables-alerter-tl27p" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498091 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sfrv\" (UniqueName: \"kubernetes.io/projected/d37887a7-2697-430c-834c-76614ddbb9b9-kube-api-access-5sfrv\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498119 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b023a24-7ef2-470d-8cd5-90366b171323-ovnkube-config\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498154 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:30.498248 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498183 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-systemd-units\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498268 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-node-log\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498300 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498327 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e16d78d-6a61-4210-b4d5-ecf12d2038ca-iptables-alerter-script\") pod \"iptables-alerter-tl27p\" (UID: \"2e16d78d-6a61-4210-b4d5-ecf12d2038ca\") " pod="openshift-network-operator/iptables-alerter-tl27p" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498337 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-systemd\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498359 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b023a24-7ef2-470d-8cd5-90366b171323-ovnkube-script-lib\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498396 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-var-lib-cni-bin\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498444 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-hostroot\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498465 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-var-lib-openvswitch\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498497 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-host-run-netns\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498607 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-lib-modules\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498950 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b023a24-7ef2-470d-8cd5-90366b171323-ovnkube-script-lib\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.498977 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.498965 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f27d004-e5fc-4560-8699-ce203c2bf77e-cni-binary-copy\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499003 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-os-release\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499013 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b023a24-7ef2-470d-8cd5-90366b171323-env-overrides\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499036 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2ld\" (UniqueName: \"kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld\") pod \"network-check-target-7r5l5\" (UID: \"167f8a10-92f4-444e-912f-415dafc03e58\") " pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499072 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-sysconfig\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499083 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-socket-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499102 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-sys\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.499504 ip-10-0-128-68 
kubenswrapper[2562]: I0416 18:16:30.499132 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-var-lib-kubelet\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499135 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e16d78d-6a61-4210-b4d5-ecf12d2038ca-host-slash\") pod \"iptables-alerter-tl27p\" (UID: \"2e16d78d-6a61-4210-b4d5-ecf12d2038ca\") " pod="openshift-network-operator/iptables-alerter-tl27p" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499163 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64568791-94bd-49ea-adf9-c39f5c4c8f08-tmp\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499202 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-host-var-lib-kubelet\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499207 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-registration-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499263 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-multus-socket-dir-parent\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499312 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-registration-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499331 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-node-log\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499389 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d024a606-d155-4b9c-9936-eff2f2e2603c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.499504 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499401 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-var-lib-kubelet\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.500173 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499582 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-sysctl-d\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.500173 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499734 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b023a24-7ef2-470d-8cd5-90366b171323-systemd-units\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.500173 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499746 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-sysconfig\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.500173 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499864 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.500173 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.499930 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d37887a7-2697-430c-834c-76614ddbb9b9-system-cni-dir\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.500173 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:30.499943 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:30.500173 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.500034 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64568791-94bd-49ea-adf9-c39f5c4c8f08-sys\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.500173 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.500109 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0f27d004-e5fc-4560-8699-ce203c2bf77e-os-release\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.500173 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.500034 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b023a24-7ef2-470d-8cd5-90366b171323-ovnkube-config\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.500623 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.500522 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e16d78d-6a61-4210-b4d5-ecf12d2038ca-iptables-alerter-script\") pod \"iptables-alerter-tl27p\" (UID: \"2e16d78d-6a61-4210-b4d5-ecf12d2038ca\") " pod="openshift-network-operator/iptables-alerter-tl27p" Apr 16 18:16:30.501532 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:30.501502 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs podName:7682219f-20c8-40ee-a84d-c68d79df1dd8 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:31.001457218 +0000 UTC m=+2.086906637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs") pod "network-metrics-daemon-4vgjf" (UID: "7682219f-20c8-40ee-a84d-c68d79df1dd8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:30.502747 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.502729 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64568791-94bd-49ea-adf9-c39f5c4c8f08-tmp\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.502894 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.502860 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b023a24-7ef2-470d-8cd5-90366b171323-ovn-node-metrics-cert\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.502993 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.502979 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5f7e2ea0-df8c-485d-a95a-e622c53fab2d-agent-certs\") pod \"konnectivity-agent-pbnvm\" (UID: \"5f7e2ea0-df8c-485d-a95a-e622c53fab2d\") " pod="kube-system/konnectivity-agent-pbnvm" Apr 16 18:16:30.503053 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.503036 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/64568791-94bd-49ea-adf9-c39f5c4c8f08-etc-tuned\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.506320 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.506295 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z84p8\" (UniqueName: \"kubernetes.io/projected/7b023a24-7ef2-470d-8cd5-90366b171323-kube-api-access-z84p8\") pod \"ovnkube-node-tttq4\" (UID: \"7b023a24-7ef2-470d-8cd5-90366b171323\") " pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:30.506430 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.506343 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88xdr\" 
(UniqueName: \"kubernetes.io/projected/7682219f-20c8-40ee-a84d-c68d79df1dd8-kube-api-access-88xdr\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:30.507543 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.507518 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9tgx\" (UniqueName: \"kubernetes.io/projected/d4f7966d-78bf-4cbb-a764-b066fe69e484-kube-api-access-s9tgx\") pod \"node-ca-5m9b6\" (UID: \"d4f7966d-78bf-4cbb-a764-b066fe69e484\") " pod="openshift-image-registry/node-ca-5m9b6" Apr 16 18:16:30.507803 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.507785 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w76g5\" (UniqueName: \"kubernetes.io/projected/1ac43e44-6bc5-4678-9022-029aed19a8c1-kube-api-access-w76g5\") pod \"node-resolver-8spcs\" (UID: \"1ac43e44-6bc5-4678-9022-029aed19a8c1\") " pod="openshift-dns/node-resolver-8spcs" Apr 16 18:16:30.508973 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:30.508957 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:16:30.509041 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:30.508976 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:16:30.509041 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:30.508990 2562 projected.go:194] Error preparing data for projected volume kube-api-access-8z2ld for pod openshift-network-diagnostics/network-check-target-7r5l5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:30.509177 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:30.509049 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld podName:167f8a10-92f4-444e-912f-415dafc03e58 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:31.009036133 +0000 UTC m=+2.094485522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8z2ld" (UniqueName: "kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld") pod "network-check-target-7r5l5" (UID: "167f8a10-92f4-444e-912f-415dafc03e58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:30.511107 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.511082 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sfrv\" (UniqueName: \"kubernetes.io/projected/d37887a7-2697-430c-834c-76614ddbb9b9-kube-api-access-5sfrv\") pod \"multus-rbhdn\" (UID: \"d37887a7-2697-430c-834c-76614ddbb9b9\") " pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.511258 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.511128 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l59tb\" (UniqueName: \"kubernetes.io/projected/d024a606-d155-4b9c-9936-eff2f2e2603c-kube-api-access-l59tb\") pod \"aws-ebs-csi-driver-node-vjqph\" (UID: \"d024a606-d155-4b9c-9936-eff2f2e2603c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" Apr 16 18:16:30.512723 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.512707 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ct28\" (UniqueName: \"kubernetes.io/projected/0f27d004-e5fc-4560-8699-ce203c2bf77e-kube-api-access-5ct28\") pod \"multus-additional-cni-plugins-87852\" (UID: \"0f27d004-e5fc-4560-8699-ce203c2bf77e\") " pod="openshift-multus/multus-additional-cni-plugins-87852" Apr 16 18:16:30.512810 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.512710 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bczbz\" (UniqueName: \"kubernetes.io/projected/2e16d78d-6a61-4210-b4d5-ecf12d2038ca-kube-api-access-bczbz\") pod \"iptables-alerter-tl27p\" (UID: \"2e16d78d-6a61-4210-b4d5-ecf12d2038ca\") " pod="openshift-network-operator/iptables-alerter-tl27p" Apr 16 18:16:30.514460 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.514436 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2gbb\" (UniqueName: \"kubernetes.io/projected/64568791-94bd-49ea-adf9-c39f5c4c8f08-kube-api-access-k2gbb\") pod \"tuned-zww8n\" (UID: \"64568791-94bd-49ea-adf9-c39f5c4c8f08\") " pod="openshift-cluster-node-tuning-operator/tuned-zww8n" Apr 16 18:16:30.517078 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.517064 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rbhdn" Apr 16 18:16:30.529653 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.529629 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tl27p" Apr 16 18:16:30.535016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.534997 2562 util.go:30] "No sandbox for pod can be found. 
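The durationBeforeRetry values in the nestedpendingoperations errors above start at 500ms and double on each consecutive failure of the same volume operation (1s, 2s, 4s, and 8s/16s later in this boot). A minimal Go sketch of that doubling-with-cap schedule; the 500ms initial delay is taken from the log, while the maxBackoff cap is an assumed illustrative value, not a constant confirmed by this log:

    package main

    import (
        "fmt"
        "time"
    )

    const (
        initialBackoff = 500 * time.Millisecond // matches "durationBeforeRetry 500ms" above
        maxBackoff     = 2 * time.Minute        // assumed cap for illustration
    )

    // nextBackoff returns the delay before the next retry of a failed
    // operation: the previous delay doubled, clamped to maxBackoff.
    func nextBackoff(prev time.Duration) time.Duration {
        if prev <= 0 {
            return initialBackoff
        }
        next := prev * 2
        if next > maxBackoff {
            next = maxBackoff
        }
        return next
    }

    func main() {
        // Reproduces the schedule visible in the retry messages:
        // 500ms, 1s, 2s, 4s, 8s, 16s, ...
        var d time.Duration
        for i := 0; i < 6; i++ {
            d = nextBackoff(d)
            fmt.Println(d)
        }
    }

Running this prints the same 500ms, 1s, 2s, 4s, 8s, 16s sequence that the "No retries permitted until ..." messages record as the boot proceeds.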
Apr 16 18:16:30.580556 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.580517 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69d4f0efd48f93f6cc380a4943e78ab2.slice/crio-a850f93d723479a6caa52fdb4bd0a02490cf8b9c6cdc7a15fd33278b13974971 WatchSource:0}: Error finding container a850f93d723479a6caa52fdb4bd0a02490cf8b9c6cdc7a15fd33278b13974971: Status 404 returned error can't find the container with id a850f93d723479a6caa52fdb4bd0a02490cf8b9c6cdc7a15fd33278b13974971
Apr 16 18:16:30.580991 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.580968 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96464c8f8d874d5fa7f0601b1e26dfe1.slice/crio-eddc941e71b4a63b6cbbfc1f70ecd94ba6818a4a26772be66dffb40060255e2d WatchSource:0}: Error finding container eddc941e71b4a63b6cbbfc1f70ecd94ba6818a4a26772be66dffb40060255e2d: Status 404 returned error can't find the container with id eddc941e71b4a63b6cbbfc1f70ecd94ba6818a4a26772be66dffb40060255e2d
Apr 16 18:16:30.586429 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.586415 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:16:30.713118 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.713026 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pbnvm"
Apr 16 18:16:30.719017 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.718990 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f7e2ea0_df8c_485d_a95a_e622c53fab2d.slice/crio-4e9f03071d97f2b7a6be327c88d909fd1260665cd53285dc64cc1b54429285fc WatchSource:0}: Error finding container 4e9f03071d97f2b7a6be327c88d909fd1260665cd53285dc64cc1b54429285fc: Status 404 returned error can't find the container with id 4e9f03071d97f2b7a6be327c88d909fd1260665cd53285dc64cc1b54429285fc
Apr 16 18:16:30.725829 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.725808 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5m9b6"
Apr 16 18:16:30.731703 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.731681 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4f7966d_78bf_4cbb_a764_b066fe69e484.slice/crio-ac633640c5dbf7bc37ff5fdc16117654b36d1b93098ff01cb1c8203ff3ebfc86 WatchSource:0}: Error finding container ac633640c5dbf7bc37ff5fdc16117654b36d1b93098ff01cb1c8203ff3ebfc86: Status 404 returned error can't find the container with id ac633640c5dbf7bc37ff5fdc16117654b36d1b93098ff01cb1c8203ff3ebfc86
Apr 16 18:16:30.747098 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.747076 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-87852"
Apr 16 18:16:30.754541 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.754523 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4"
Apr 16 18:16:30.754772 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.754744 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f27d004_e5fc_4560_8699_ce203c2bf77e.slice/crio-ec4214c1d16b3ffaab40e7beed3e8361cb9683746f09ff0d07298f85b4446aa2 WatchSource:0}: Error finding container ec4214c1d16b3ffaab40e7beed3e8361cb9683746f09ff0d07298f85b4446aa2: Status 404 returned error can't find the container with id ec4214c1d16b3ffaab40e7beed3e8361cb9683746f09ff0d07298f85b4446aa2
Apr 16 18:16:30.759393 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.759369 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8spcs"
Apr 16 18:16:30.761480 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.761457 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b023a24_7ef2_470d_8cd5_90366b171323.slice/crio-2ba7ac65c0f4a42fa54100c6af6e0a4773317696df492bf5baf307e021cf041f WatchSource:0}: Error finding container 2ba7ac65c0f4a42fa54100c6af6e0a4773317696df492bf5baf307e021cf041f: Status 404 returned error can't find the container with id 2ba7ac65c0f4a42fa54100c6af6e0a4773317696df492bf5baf307e021cf041f
Apr 16 18:16:30.766686 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.766664 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ac43e44_6bc5_4678_9022_029aed19a8c1.slice/crio-d1cd10bf8a99c624b11a6b1d654bb132ad9a87ddfe2eb458b74058c26600c232 WatchSource:0}: Error finding container d1cd10bf8a99c624b11a6b1d654bb132ad9a87ddfe2eb458b74058c26600c232: Status 404 returned error can't find the container with id d1cd10bf8a99c624b11a6b1d654bb132ad9a87ddfe2eb458b74058c26600c232
Apr 16 18:16:30.788606 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:30.788579 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph"
Apr 16 18:16:30.794370 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.794344 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd024a606_d155_4b9c_9936_eff2f2e2603c.slice/crio-65f5f6aa031cceff14a624b880366b63345a17cefb0e22f824e1165c02392fb4 WatchSource:0}: Error finding container 65f5f6aa031cceff14a624b880366b63345a17cefb0e22f824e1165c02392fb4: Status 404 returned error can't find the container with id 65f5f6aa031cceff14a624b880366b63345a17cefb0e22f824e1165c02392fb4
Apr 16 18:16:30.871179 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.871150 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e16d78d_6a61_4210_b4d5_ecf12d2038ca.slice/crio-a56251fe195e7a8830f90707be8a2c0357cf50580a22a8cf1c314c0bc5630930 WatchSource:0}: Error finding container a56251fe195e7a8830f90707be8a2c0357cf50580a22a8cf1c314c0bc5630930: Status 404 returned error can't find the container with id a56251fe195e7a8830f90707be8a2c0357cf50580a22a8cf1c314c0bc5630930
Apr 16 18:16:30.900343 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.900316 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64568791_94bd_49ea_adf9_c39f5c4c8f08.slice/crio-3fedb143656bd4f5ba0872234464877371d4373cc48253c8ffc811a56aeb1099 WatchSource:0}: Error finding container 3fedb143656bd4f5ba0872234464877371d4373cc48253c8ffc811a56aeb1099: Status 404 returned error can't find the container with id 3fedb143656bd4f5ba0872234464877371d4373cc48253c8ffc811a56aeb1099
Apr 16 18:16:30.962278 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:16:30.962210 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd37887a7_2697_430c_834c_76614ddbb9b9.slice/crio-af71f71d6516716e26a245d9a42a410b7a8be67493aa8fefbc4cad67246510c8 WatchSource:0}: Error finding container af71f71d6516716e26a245d9a42a410b7a8be67493aa8fefbc4cad67246510c8: Status 404 returned error can't find the container with id af71f71d6516716e26a245d9a42a410b7a8be67493aa8fefbc4cad67246510c8
Apr 16 18:16:31.002340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.002301 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:31.002516 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:31.002440 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:31.002568 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:31.002522 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs podName:7682219f-20c8-40ee-a84d-c68d79df1dd8 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:32.002501204 +0000 UTC m=+3.087950594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs") pod "network-metrics-daemon-4vgjf" (UID: "7682219f-20c8-40ee-a84d-c68d79df1dd8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:31.103149 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.103113 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2ld\" (UniqueName: \"kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld\") pod \"network-check-target-7r5l5\" (UID: \"167f8a10-92f4-444e-912f-415dafc03e58\") " pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:31.103332 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:31.103291 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:16:31.103332 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:31.103310 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:16:31.103332 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:31.103324 2562 projected.go:194] Error preparing data for projected volume kube-api-access-8z2ld for pod openshift-network-diagnostics/network-check-target-7r5l5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:31.103485 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:31.103382 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld podName:167f8a10-92f4-444e-912f-415dafc03e58 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:32.103363919 +0000 UTC m=+3.188813301 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8z2ld" (UniqueName: "kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld") pod "network-check-target-7r5l5" (UID: "167f8a10-92f4-444e-912f-415dafc03e58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:31.401284 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.401200 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:16:31.439297 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.439231 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:11:30 +0000 UTC" deadline="2027-11-17 04:05:45.451530062 +0000 UTC"
Apr 16 18:16:31.439297 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.439267 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13905h49m14.012267243s"
Apr 16 18:16:31.519619 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.519592 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5"
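Note the "Certificate rotation deadline determined" entry above and its repeat at 18:16:32.439 below: the same expiration (2028-04-15 18:11:30) yields two different deadlines, roughly 71% and 80% of the way through the certificate's lifetime. client-go's certificate manager picks each rotation deadline at a jittered random point late in the validity window, so recomputing it produces a new value. A sketch of that rule, assuming a 70% + 30%·rand split and an assumed issue time (neither constant is confirmed by this log):

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // rotationDeadline picks a random deadline between 70% and 100% of the
    // certificate's validity window, mirroring the jittered deadlines in the
    // log. The 0.7 + 0.3*rand fraction is an assumption for illustration.
    func rotationDeadline(notBefore, notAfter time.Time) time.Time {
        total := notAfter.Sub(notBefore)
        jittered := time.Duration(float64(total) * (0.7 + 0.3*rand.Float64()))
        return notBefore.Add(jittered)
    }

    func main() {
        notBefore := time.Date(2026, 4, 16, 18, 11, 30, 0, time.UTC) // assumed issue time
        notAfter := time.Date(2028, 4, 15, 18, 11, 30, 0, time.UTC)  // expiration from the log
        // Two calls give two different deadlines for the same expiration,
        // just like the 18:16:31 and 18:16:32 entries.
        fmt.Println(rotationDeadline(notBefore, notAfter))
        fmt.Println(rotationDeadline(notBefore, notAfter))
    }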
Apr 16 18:16:31.519778 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:31.519726 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58"
Apr 16 18:16:31.563596 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.563427 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87852" event={"ID":"0f27d004-e5fc-4560-8699-ce203c2bf77e","Type":"ContainerStarted","Data":"ec4214c1d16b3ffaab40e7beed3e8361cb9683746f09ff0d07298f85b4446aa2"}
Apr 16 18:16:31.585878 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.585840 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" event={"ID":"96464c8f8d874d5fa7f0601b1e26dfe1","Type":"ContainerStarted","Data":"eddc941e71b4a63b6cbbfc1f70ecd94ba6818a4a26772be66dffb40060255e2d"}
Apr 16 18:16:31.596108 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.596079 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:16:31.608705 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.608653 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal" event={"ID":"69d4f0efd48f93f6cc380a4943e78ab2","Type":"ContainerStarted","Data":"a850f93d723479a6caa52fdb4bd0a02490cf8b9c6cdc7a15fd33278b13974971"}
Apr 16 18:16:31.627308 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.627269 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rbhdn" event={"ID":"d37887a7-2697-430c-834c-76614ddbb9b9","Type":"ContainerStarted","Data":"af71f71d6516716e26a245d9a42a410b7a8be67493aa8fefbc4cad67246510c8"}
Apr 16 18:16:31.647413 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.647376 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tl27p" event={"ID":"2e16d78d-6a61-4210-b4d5-ecf12d2038ca","Type":"ContainerStarted","Data":"a56251fe195e7a8830f90707be8a2c0357cf50580a22a8cf1c314c0bc5630930"}
Apr 16 18:16:31.664524 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.664442 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8spcs" event={"ID":"1ac43e44-6bc5-4678-9022-029aed19a8c1","Type":"ContainerStarted","Data":"d1cd10bf8a99c624b11a6b1d654bb132ad9a87ddfe2eb458b74058c26600c232"}
Apr 16 18:16:31.675112 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.675074 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5m9b6" event={"ID":"d4f7966d-78bf-4cbb-a764-b066fe69e484","Type":"ContainerStarted","Data":"ac633640c5dbf7bc37ff5fdc16117654b36d1b93098ff01cb1c8203ff3ebfc86"}
Apr 16 18:16:31.686209 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.686041 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pbnvm" event={"ID":"5f7e2ea0-df8c-485d-a95a-e622c53fab2d","Type":"ContainerStarted","Data":"4e9f03071d97f2b7a6be327c88d909fd1260665cd53285dc64cc1b54429285fc"}
Apr 16 18:16:31.713874 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.713828 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zww8n" event={"ID":"64568791-94bd-49ea-adf9-c39f5c4c8f08","Type":"ContainerStarted","Data":"3fedb143656bd4f5ba0872234464877371d4373cc48253c8ffc811a56aeb1099"}
Apr 16 18:16:31.738851 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.738799 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" event={"ID":"d024a606-d155-4b9c-9936-eff2f2e2603c","Type":"ContainerStarted","Data":"65f5f6aa031cceff14a624b880366b63345a17cefb0e22f824e1165c02392fb4"}
Apr 16 18:16:31.751301 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:31.751160 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" event={"ID":"7b023a24-7ef2-470d-8cd5-90366b171323","Type":"ContainerStarted","Data":"2ba7ac65c0f4a42fa54100c6af6e0a4773317696df492bf5baf307e021cf041f"}
Apr 16 18:16:32.011100 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:32.010494 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:32.011100 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:32.010664 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:32.011100 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:32.010728 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs podName:7682219f-20c8-40ee-a84d-c68d79df1dd8 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:34.01070883 +0000 UTC m=+5.096158204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs") pod "network-metrics-daemon-4vgjf" (UID: "7682219f-20c8-40ee-a84d-c68d79df1dd8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:32.022023 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:32.021818 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:16:32.111038 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:32.111000 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2ld\" (UniqueName: \"kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld\") pod \"network-check-target-7r5l5\" (UID: \"167f8a10-92f4-444e-912f-415dafc03e58\") " pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:32.111246 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:32.111173 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:16:32.111246 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:32.111207 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:16:32.111246 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:32.111221 2562 projected.go:194] Error preparing data for projected volume kube-api-access-8z2ld for pod openshift-network-diagnostics/network-check-target-7r5l5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:32.111415 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:32.111277 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld podName:167f8a10-92f4-444e-912f-415dafc03e58 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:34.111259235 +0000 UTC m=+5.196708615 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8z2ld" (UniqueName: "kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld") pod "network-check-target-7r5l5" (UID: "167f8a10-92f4-444e-912f-415dafc03e58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:32.439702 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:32.439586 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:11:30 +0000 UTC" deadline="2027-09-16 16:42:40.6513589 +0000 UTC"
Apr 16 18:16:32.439702 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:32.439629 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12430h26m8.211733841s"
Apr 16 18:16:32.520014 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:32.519908 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:32.520207 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:32.520135 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8"
Apr 16 18:16:33.521994 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:33.521474 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:33.521994 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:33.521596 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58"
Apr 16 18:16:34.028005 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:34.027910 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:34.028369 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:34.028085 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:34.028369 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:34.028168 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs podName:7682219f-20c8-40ee-a84d-c68d79df1dd8 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:38.028139186 +0000 UTC m=+9.113588556 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs") pod "network-metrics-daemon-4vgjf" (UID: "7682219f-20c8-40ee-a84d-c68d79df1dd8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:34.128938 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:34.128900 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2ld\" (UniqueName: \"kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld\") pod \"network-check-target-7r5l5\" (UID: \"167f8a10-92f4-444e-912f-415dafc03e58\") " pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:34.129129 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:34.129087 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:16:34.129129 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:34.129113 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:16:34.129129 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:34.129127 2562 projected.go:194] Error preparing data for projected volume kube-api-access-8z2ld for pod openshift-network-diagnostics/network-check-target-7r5l5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:34.129571 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:34.129203 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld podName:167f8a10-92f4-444e-912f-415dafc03e58 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:38.129170369 +0000 UTC m=+9.214619758 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8z2ld" (UniqueName: "kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld") pod "network-check-target-7r5l5" (UID: "167f8a10-92f4-444e-912f-415dafc03e58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:34.519820 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:34.519781 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:34.520013 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:34.519936 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8"
Apr 16 18:16:35.520152 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:35.519652 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:35.520152 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:35.519789 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58"
Apr 16 18:16:36.519027 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:36.518988 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:36.519244 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:36.519140 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8"
Apr 16 18:16:37.519927 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:37.519896 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:37.520419 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:37.520031 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58"
Apr 16 18:16:38.062589 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:38.062549 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:38.062766 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:38.062750 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:38.062849 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:38.062827 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs podName:7682219f-20c8-40ee-a84d-c68d79df1dd8 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:46.062806818 +0000 UTC m=+17.148256192 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs") pod "network-metrics-daemon-4vgjf" (UID: "7682219f-20c8-40ee-a84d-c68d79df1dd8") : object "openshift-multus"/"metrics-daemon-secret" not registered
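The repeating "no CNI configuration file in /etc/kubernetes/cni/net.d/" errors above are the expected startup ordering: the kubelet reports NetworkReady=false, and skips syncing pods that need the pod network, until a CNI plugin (ovnkube-node on this cluster) writes a config file into that directory; host-network daemonsets such as multus and tuned start regardless. A minimal Go sketch of that readiness condition, checking the directory the errors name for a config file (the accepted extensions are an assumption for illustration):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // cniConfigPresent reports whether any CNI network config exists in dir,
    // which is the condition the NetworkReady check above is waiting on.
    func cniConfigPresent(dir string) bool {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false // unreadable or missing dir counts as not ready
        }
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // assumed config extensions
                return true
            }
        }
        return false
    }

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // directory named in the errors above
        if cniConfigPresent(dir) {
            fmt.Println("NetworkReady=true: CNI config found")
        } else {
            fmt.Println("NetworkReady=false: no CNI configuration file in", dir)
        }
    }

Once ovnkube-node drops its config there, these pod_workers errors stop and the queued sandboxes are created.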
Apr 16 18:16:38.163813 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:38.163697 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2ld\" (UniqueName: \"kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld\") pod \"network-check-target-7r5l5\" (UID: \"167f8a10-92f4-444e-912f-415dafc03e58\") " pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:38.163991 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:38.163881 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:16:38.163991 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:38.163899 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:16:38.163991 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:38.163911 2562 projected.go:194] Error preparing data for projected volume kube-api-access-8z2ld for pod openshift-network-diagnostics/network-check-target-7r5l5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:38.163991 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:38.163974 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld podName:167f8a10-92f4-444e-912f-415dafc03e58 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:46.163955652 +0000 UTC m=+17.249405035 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8z2ld" (UniqueName: "kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld") pod "network-check-target-7r5l5" (UID: "167f8a10-92f4-444e-912f-415dafc03e58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:38.519767 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:38.519681 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:38.519957 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:38.519826 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8"
Apr 16 18:16:39.357155 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.357063 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-q5k8m"]
Apr 16 18:16:39.361768 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.361738 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:39.361951 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:39.361828 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a"
Apr 16 18:16:39.473805 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.473748 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97a27848-fb3c-407c-8213-4e08944e760a-dbus\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:39.473985 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.473813 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:39.473985 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.473880 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97a27848-fb3c-407c-8213-4e08944e760a-kubelet-config\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:39.520664 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.520611 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:39.521115 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:39.520731 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58"
Apr 16 18:16:39.574470 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.574436 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:39.574712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.574519 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97a27848-fb3c-407c-8213-4e08944e760a-kubelet-config\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:39.574712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.574578 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97a27848-fb3c-407c-8213-4e08944e760a-dbus\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:39.574915 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.574762 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97a27848-fb3c-407c-8213-4e08944e760a-dbus\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:39.574915 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:39.574827 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:16:39.574915 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:39.574836 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97a27848-fb3c-407c-8213-4e08944e760a-kubelet-config\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:39.574915 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:39.574898 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret podName:97a27848-fb3c-407c-8213-4e08944e760a nodeName:}" failed. No retries permitted until 2026-04-16 18:16:40.074876111 +0000 UTC m=+11.160325497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret") pod "global-pull-secret-syncer-q5k8m" (UID: "97a27848-fb3c-407c-8213-4e08944e760a") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:16:40.077511 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:40.077473 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:40.077710 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:40.077599 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:16:40.077710 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:40.077681 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret podName:97a27848-fb3c-407c-8213-4e08944e760a nodeName:}" failed. No retries permitted until 2026-04-16 18:16:41.077662335 +0000 UTC m=+12.163111709 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret") pod "global-pull-secret-syncer-q5k8m" (UID: "97a27848-fb3c-407c-8213-4e08944e760a") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:16:40.519639 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:40.519612 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:40.519758 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:40.519727 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8"
Apr 16 18:16:41.084267 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:41.084218 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:41.084714 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:41.084374 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:16:41.084714 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:41.084452 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret podName:97a27848-fb3c-407c-8213-4e08944e760a nodeName:}" failed. No retries permitted until 2026-04-16 18:16:43.084433866 +0000 UTC m=+14.169883252 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret") pod "global-pull-secret-syncer-q5k8m" (UID: "97a27848-fb3c-407c-8213-4e08944e760a") : object "kube-system"/"original-pull-secret" not registered
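The 'object "kube-system"/"original-pull-secret" not registered' failures above, like the metrics-daemon-secret and configmap ones earlier, come from the kubelet's secret/configmap managers: a secret is only fetchable after the manager has registered a pod that references it, so mounts attempted before that registration completes fail and land on the same retry backoff. A toy register-then-get cache showing the behavior; types and names here are illustrative, not the kubelet's actual API:

    package main

    import (
        "fmt"
        "sync"
    )

    // secretCache hands out secrets only for namespace/name pairs that some
    // registered pod references, mimicking the "not registered" errors above.
    type secretCache struct {
        mu         sync.Mutex
        registered map[string]bool   // "namespace/name" -> referenced by a pod
        data       map[string]string // stand-in for the secret payload
    }

    func newSecretCache() *secretCache {
        return &secretCache{registered: map[string]bool{}, data: map[string]string{}}
    }

    // RegisterPod records that a pod references the given secret.
    func (c *secretCache) RegisterPod(namespace, name string) {
        c.mu.Lock()
        defer c.mu.Unlock()
        c.registered[namespace+"/"+name] = true
    }

    // GetSecret fails for unregistered objects, as the mounts above do.
    func (c *secretCache) GetSecret(namespace, name string) (string, error) {
        c.mu.Lock()
        defer c.mu.Unlock()
        key := namespace + "/" + name
        if !c.registered[key] {
            return "", fmt.Errorf("object %q/%q not registered", namespace, name)
        }
        return c.data[key], nil
    }

    func main() {
        c := newSecretCache()
        _, err := c.GetSecret("kube-system", "original-pull-secret")
        fmt.Println(err) // object "kube-system"/"original-pull-secret" not registered
        c.RegisterPod("kube-system", "original-pull-secret")
        _, err = c.GetSecret("kube-system", "original-pull-secret")
        fmt.Println(err) // <nil>
    }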
Apr 16 18:16:41.519949 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:41.519911 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:41.520138 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:41.520032 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58"
Apr 16 18:16:41.520138 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:41.520127 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:41.520277 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:41.520247 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a"
Apr 16 18:16:42.519594 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:42.519564 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:42.520010 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:42.519698 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8"
Apr 16 18:16:43.098073 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:43.098029 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:43.098276 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:43.098217 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:16:43.098353 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:43.098304 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret podName:97a27848-fb3c-407c-8213-4e08944e760a nodeName:}" failed. No retries permitted until 2026-04-16 18:16:47.098282596 +0000 UTC m=+18.183731986 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret") pod "global-pull-secret-syncer-q5k8m" (UID: "97a27848-fb3c-407c-8213-4e08944e760a") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:16:43.519268 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:43.519237 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:43.519446 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:43.519342 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58"
Apr 16 18:16:43.519446 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:43.519359 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:43.519613 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:43.519471 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a"
Apr 16 18:16:44.519943 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:44.519904 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:44.520438 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:44.520057 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8"
Apr 16 18:16:45.519856 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:45.519823 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:45.519856 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:45.519847 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:45.520372 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:45.519932 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a"
Apr 16 18:16:45.520372 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:45.520047 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58"
Apr 16 18:16:46.117434 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:46.117391 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:46.117660 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:46.117571 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:46.117660 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:46.117653 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs podName:7682219f-20c8-40ee-a84d-c68d79df1dd8 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:02.117633435 +0000 UTC m=+33.203082817 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs") pod "network-metrics-daemon-4vgjf" (UID: "7682219f-20c8-40ee-a84d-c68d79df1dd8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:46.217959 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:46.217921 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2ld\" (UniqueName: \"kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld\") pod \"network-check-target-7r5l5\" (UID: \"167f8a10-92f4-444e-912f-415dafc03e58\") " pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:46.218105 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:46.218052 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:16:46.218105 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:46.218067 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:16:46.218105 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:46.218076 2562 projected.go:194] Error preparing data for projected volume kube-api-access-8z2ld for pod openshift-network-diagnostics/network-check-target-7r5l5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:46.218271 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:46.218126 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld podName:167f8a10-92f4-444e-912f-415dafc03e58 nodeName:}" failed.
No retries permitted until 2026-04-16 18:17:02.218112311 +0000 UTC m=+33.303561684 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8z2ld" (UniqueName: "kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld") pod "network-check-target-7r5l5" (UID: "167f8a10-92f4-444e-912f-415dafc03e58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:46.519791 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:46.519758 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:16:46.519973 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:46.519875 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8"
Apr 16 18:16:47.123920 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:47.123883 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:47.124276 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:47.124036 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:16:47.124276 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:47.124103 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret podName:97a27848-fb3c-407c-8213-4e08944e760a nodeName:}" failed. No retries permitted until 2026-04-16 18:16:55.124087161 +0000 UTC m=+26.209536534 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret") pod "global-pull-secret-syncer-q5k8m" (UID: "97a27848-fb3c-407c-8213-4e08944e760a") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:16:47.519323 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:47.519284 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:16:47.519496 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:47.519284 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:16:47.519496 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:47.519429 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58"
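The durationBeforeRetry values above trace the volume manager's per-operation exponential backoff: the same MountVolume.SetUp for original-pull-secret is retried after 1s, 2s, 4s and now 8s, and later entries show 16s and 32s. Below is a minimal Go sketch of that doubling schedule; the initial delay and the cap are illustrative assumptions, not the kubelet's exact constants.

package main

import (
	"fmt"
	"time"
)

// expBackoff reproduces the doubling retry delay visible in the
// durationBeforeRetry fields above (1s, 2s, 4s, 8s, 16s, 32s, ...).
type expBackoff struct {
	delay time.Duration // next delay to hand out
	cap   time.Duration // upper bound on the delay (assumed value)
}

// next returns the current delay and doubles it for the following
// attempt, saturating at the cap.
func (b *expBackoff) next() time.Duration {
	d := b.delay
	b.delay *= 2
	if b.delay > b.cap {
		b.delay = b.cap
	}
	return d
}

func main() {
	b := expBackoff{delay: time.Second, cap: 2 * time.Minute}
	for i := 0; i < 6; i++ {
		fmt.Println(b.next()) // 1s 2s 4s 8s 16s 32s
	}
}

Each failed SetUp asks for the next pause before the operation may be queued again, which is why the "No retries permitted until ..." deadlines in these entries double each round until the missing object shows up.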
pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58" Apr 16 18:16:47.519623 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:47.519499 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a" Apr 16 18:16:48.519854 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:48.519658 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:48.520315 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:48.519950 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8" Apr 16 18:16:49.520631 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.520141 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:16:49.520631 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:49.520515 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58" Apr 16 18:16:49.520631 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.520330 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m" Apr 16 18:16:49.520631 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:49.520606 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a" Apr 16 18:16:49.786390 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.786341 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zww8n" event={"ID":"64568791-94bd-49ea-adf9-c39f5c4c8f08","Type":"ContainerStarted","Data":"e3ee31b552cd07772e8284092fa53ec9223540f209ab94ff8c33659022b3422d"} Apr 16 18:16:49.790573 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.790548 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log" Apr 16 18:16:49.791000 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.790969 2562 generic.go:358] "Generic (PLEG): container finished" podID="7b023a24-7ef2-470d-8cd5-90366b171323" containerID="23609817cf96173a745690e42d031d6cd5a267a3d73f1809b4c587aa18306a07" exitCode=1 Apr 16 18:16:49.791152 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.791041 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" event={"ID":"7b023a24-7ef2-470d-8cd5-90366b171323","Type":"ContainerStarted","Data":"14096ebb367d40efde09759341a40e5d92b079595057b2e3071ec421ad2cd8ff"} Apr 16 18:16:49.791152 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.791071 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" event={"ID":"7b023a24-7ef2-470d-8cd5-90366b171323","Type":"ContainerStarted","Data":"370785e2dc0031dab02180ac1088f3506daee9db2181f77bd4a7c98f5f9a1c3e"} Apr 16 18:16:49.791152 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.791081 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" event={"ID":"7b023a24-7ef2-470d-8cd5-90366b171323","Type":"ContainerStarted","Data":"45e42de2dbfdb78b30683d26588366ddb3a179d52b9c489308459be3843510b5"} Apr 16 18:16:49.791152 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.791094 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" event={"ID":"7b023a24-7ef2-470d-8cd5-90366b171323","Type":"ContainerStarted","Data":"9de7f725f398ceb6ee2d24517f4da6b03276ab511ce61dc3995742e90fdf7d24"} Apr 16 18:16:49.791152 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.791106 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" event={"ID":"7b023a24-7ef2-470d-8cd5-90366b171323","Type":"ContainerDied","Data":"23609817cf96173a745690e42d031d6cd5a267a3d73f1809b4c587aa18306a07"} Apr 16 18:16:49.791152 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.791121 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" event={"ID":"7b023a24-7ef2-470d-8cd5-90366b171323","Type":"ContainerStarted","Data":"d2845146fcc8ccfbb5828dccfa3dfad12349d8811ec47c91dee4dfcbdce77bbd"} Apr 16 18:16:49.794317 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.794268 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal" event={"ID":"69d4f0efd48f93f6cc380a4943e78ab2","Type":"ContainerStarted","Data":"0a1fb243a22363bb3345c80cb900813998c8fd29c8e35ca38705e92a9cb01e56"} Apr 16 18:16:49.795848 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.795814 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rbhdn" 
event={"ID":"d37887a7-2697-430c-834c-76614ddbb9b9","Type":"ContainerStarted","Data":"fdde25853c82d3842cdf5ef7d41fa857fbebe61dfd6d3eb38b5127dd30d5c38a"} Apr 16 18:16:49.829352 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.828006 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rbhdn" podStartSLOduration=2.905360285 podStartE2EDuration="20.827956864s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:16:30.964208921 +0000 UTC m=+2.049658296" lastFinishedPulling="2026-04-16 18:16:48.88680549 +0000 UTC m=+19.972254875" observedRunningTime="2026-04-16 18:16:49.826618184 +0000 UTC m=+20.912067572" watchObservedRunningTime="2026-04-16 18:16:49.827956864 +0000 UTC m=+20.913406256" Apr 16 18:16:49.829352 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.828362 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zww8n" podStartSLOduration=2.864849273 podStartE2EDuration="20.828353812s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:16:30.901758785 +0000 UTC m=+1.987208155" lastFinishedPulling="2026-04-16 18:16:48.865263326 +0000 UTC m=+19.950712694" observedRunningTime="2026-04-16 18:16:49.805584145 +0000 UTC m=+20.891033545" watchObservedRunningTime="2026-04-16 18:16:49.828353812 +0000 UTC m=+20.913803195" Apr 16 18:16:49.846082 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:49.845611 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-68.ec2.internal" podStartSLOduration=19.845593347 podStartE2EDuration="19.845593347s" podCreationTimestamp="2026-04-16 18:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:16:49.844884988 +0000 UTC m=+20.930334382" watchObservedRunningTime="2026-04-16 18:16:49.845593347 +0000 UTC m=+20.931042738" Apr 16 18:16:50.520047 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.519811 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:50.520223 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:50.520139 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8" Apr 16 18:16:50.799162 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.799117 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tl27p" event={"ID":"2e16d78d-6a61-4210-b4d5-ecf12d2038ca","Type":"ContainerStarted","Data":"8d941652dddf7d71ac456fb3cc894e07d77868a89965e93a4fe9df54141a882b"} Apr 16 18:16:50.800721 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.800688 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8spcs" event={"ID":"1ac43e44-6bc5-4678-9022-029aed19a8c1","Type":"ContainerStarted","Data":"f12e61a75d264f91792553b417faa050e95687fc0f2d742b9f55e97a9551e3d3"} Apr 16 18:16:50.802022 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.801975 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5m9b6" event={"ID":"d4f7966d-78bf-4cbb-a764-b066fe69e484","Type":"ContainerStarted","Data":"fea67d909d14e123fe96f10713b4d131a44e435071829d71bce6b4f6cf396ba7"} Apr 16 18:16:50.803457 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.803431 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pbnvm" event={"ID":"5f7e2ea0-df8c-485d-a95a-e622c53fab2d","Type":"ContainerStarted","Data":"256ea84e0a8a5ad3840e0567cfef3eecd94c1f91993fbb7008b6f9de0775a65e"} Apr 16 18:16:50.804939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.804915 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" event={"ID":"d024a606-d155-4b9c-9936-eff2f2e2603c","Type":"ContainerStarted","Data":"9e6223a76093c7f7449628d004c8b7a045fdd9389f459a22e9f2c15fd7a27e39"} Apr 16 18:16:50.806394 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.806371 2562 generic.go:358] "Generic (PLEG): container finished" podID="0f27d004-e5fc-4560-8699-ce203c2bf77e" containerID="cdd91ee8251340d0e2c592a48b20499e271c7ee0409762bbdf6f1fa4d9324b64" exitCode=0 Apr 16 18:16:50.806493 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.806438 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87852" event={"ID":"0f27d004-e5fc-4560-8699-ce203c2bf77e","Type":"ContainerDied","Data":"cdd91ee8251340d0e2c592a48b20499e271c7ee0409762bbdf6f1fa4d9324b64"} Apr 16 18:16:50.808007 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.807982 2562 generic.go:358] "Generic (PLEG): container finished" podID="96464c8f8d874d5fa7f0601b1e26dfe1" containerID="14f95eff6da0725f39314aed773981891852c4d73a20b694731a9333515bbd1d" exitCode=0 Apr 16 18:16:50.808076 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.808039 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" event={"ID":"96464c8f8d874d5fa7f0601b1e26dfe1","Type":"ContainerDied","Data":"14f95eff6da0725f39314aed773981891852c4d73a20b694731a9333515bbd1d"} Apr 16 18:16:50.815261 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.815183 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tl27p" podStartSLOduration=4.04698217 podStartE2EDuration="21.815136185s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:16:30.872899692 +0000 UTC m=+1.958349065" lastFinishedPulling="2026-04-16 18:16:48.641053707 +0000 UTC m=+19.726503080" 
observedRunningTime="2026-04-16 18:16:50.814431944 +0000 UTC m=+21.899881350" watchObservedRunningTime="2026-04-16 18:16:50.815136185 +0000 UTC m=+21.900585576" Apr 16 18:16:50.829941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.829888 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8spcs" podStartSLOduration=3.7334104679999998 podStartE2EDuration="21.829870544s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:16:30.768857131 +0000 UTC m=+1.854306501" lastFinishedPulling="2026-04-16 18:16:48.865317205 +0000 UTC m=+19.950766577" observedRunningTime="2026-04-16 18:16:50.829275097 +0000 UTC m=+21.914724500" watchObservedRunningTime="2026-04-16 18:16:50.829870544 +0000 UTC m=+21.915319936" Apr 16 18:16:50.868260 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.868174 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5m9b6" podStartSLOduration=3.876499388 podStartE2EDuration="21.868155206s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:16:30.73296066 +0000 UTC m=+1.818410028" lastFinishedPulling="2026-04-16 18:16:48.724616463 +0000 UTC m=+19.810065846" observedRunningTime="2026-04-16 18:16:50.8457803 +0000 UTC m=+21.931229691" watchObservedRunningTime="2026-04-16 18:16:50.868155206 +0000 UTC m=+21.953604600" Apr 16 18:16:50.901397 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:50.901338 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pbnvm" podStartSLOduration=3.980747999 podStartE2EDuration="21.901319369s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:16:30.720486184 +0000 UTC m=+1.805935554" lastFinishedPulling="2026-04-16 18:16:48.641057544 +0000 UTC m=+19.726506924" observedRunningTime="2026-04-16 18:16:50.900996749 +0000 UTC m=+21.986446141" watchObservedRunningTime="2026-04-16 18:16:50.901319369 +0000 UTC m=+21.986768762" Apr 16 18:16:51.003912 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:51.003889 2562 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:16:51.455072 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:51.454978 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:16:51.003907156Z","UUID":"a78bcd64-93aa-4a4c-b488-974f96a72035","Handler":null,"Name":"","Endpoint":""} Apr 16 18:16:51.456887 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:51.456863 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:16:51.456887 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:51.456893 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:16:51.523397 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:51.523369 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:16:51.523564 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:51.523486 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58" Apr 16 18:16:51.523782 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:51.523369 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m" Apr 16 18:16:51.523782 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:51.523746 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a" Apr 16 18:16:51.812743 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:51.812704 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" event={"ID":"d024a606-d155-4b9c-9936-eff2f2e2603c","Type":"ContainerStarted","Data":"04f3ff1a4019dcbd9990bd205309a3bf07ce119f891920e8fe6e0d426ebd3fab"} Apr 16 18:16:51.814781 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:51.814741 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" event={"ID":"96464c8f8d874d5fa7f0601b1e26dfe1","Type":"ContainerStarted","Data":"06418f3616b886bfb8bb62d4970b3ba9ac9074262c1316cf58102166678eef10"} Apr 16 18:16:51.834094 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:51.834041 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-68.ec2.internal" podStartSLOduration=21.834023847 podStartE2EDuration="21.834023847s" podCreationTimestamp="2026-04-16 18:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:16:51.833360311 +0000 UTC m=+22.918809706" watchObservedRunningTime="2026-04-16 18:16:51.834023847 +0000 UTC m=+22.919473236" Apr 16 18:16:52.519600 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:52.519564 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:52.519782 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:52.519692 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8" Apr 16 18:16:52.819167 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:52.819018 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" event={"ID":"d024a606-d155-4b9c-9936-eff2f2e2603c","Type":"ContainerStarted","Data":"52dfbe735f000358c741a7899be7dfaf0eb614f55de77c7191d391b109b8d35f"} Apr 16 18:16:52.822256 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:52.822228 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log" Apr 16 18:16:52.822948 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:52.822921 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" event={"ID":"7b023a24-7ef2-470d-8cd5-90366b171323","Type":"ContainerStarted","Data":"a5c8a3c363bc8f17b061588064f5e7a8af172e839400c48870c1a5264ebdabfc"} Apr 16 18:16:52.838522 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:52.838467 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjqph" podStartSLOduration=2.690496933 podStartE2EDuration="23.838446698s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:16:30.795904162 +0000 UTC m=+1.881353532" lastFinishedPulling="2026-04-16 18:16:51.943853911 +0000 UTC m=+23.029303297" observedRunningTime="2026-04-16 18:16:52.837782462 +0000 UTC m=+23.923231854" watchObservedRunningTime="2026-04-16 18:16:52.838446698 +0000 UTC m=+23.923896086" Apr 16 18:16:53.519423 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:53.519382 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:16:53.519600 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:53.519430 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m" Apr 16 18:16:53.519600 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:53.519514 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58" Apr 16 18:16:53.519685 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:53.519624 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a" Apr 16 18:16:54.519470 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:54.519437 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:54.519865 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:54.519569 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8" Apr 16 18:16:55.182913 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.182706 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m" Apr 16 18:16:55.183056 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:55.182859 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:16:55.183056 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:55.182987 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret podName:97a27848-fb3c-407c-8213-4e08944e760a nodeName:}" failed. No retries permitted until 2026-04-16 18:17:11.182973646 +0000 UTC m=+42.268423014 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret") pod "global-pull-secret-syncer-q5k8m" (UID: "97a27848-fb3c-407c-8213-4e08944e760a") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:16:55.325225 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.325111 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pbnvm" Apr 16 18:16:55.325768 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.325751 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pbnvm" Apr 16 18:16:55.519213 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.519161 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:16:55.519213 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.519174 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m" Apr 16 18:16:55.519411 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:55.519278 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58" Apr 16 18:16:55.519445 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:55.519411 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a" Apr 16 18:16:55.830535 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.830372 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log" Apr 16 18:16:55.831276 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.830845 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" event={"ID":"7b023a24-7ef2-470d-8cd5-90366b171323","Type":"ContainerStarted","Data":"2b19fb2c0f8ad7f7c3c0f62c413c693e7ea6d27f7d0e5407ec19658f4db0223a"} Apr 16 18:16:55.831276 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.831109 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:55.831276 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.831137 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:55.831439 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.831329 2562 scope.go:117] "RemoveContainer" containerID="23609817cf96173a745690e42d031d6cd5a267a3d73f1809b4c587aa18306a07" Apr 16 18:16:55.832624 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.832600 2562 generic.go:358] "Generic (PLEG): container finished" podID="0f27d004-e5fc-4560-8699-ce203c2bf77e" containerID="4d90e8bb1d7915f266d872be3e20f5ed2704546244e0d6efec3cb0e8c451eb0f" exitCode=0 Apr 16 18:16:55.832765 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.832682 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87852" event={"ID":"0f27d004-e5fc-4560-8699-ce203c2bf77e","Type":"ContainerDied","Data":"4d90e8bb1d7915f266d872be3e20f5ed2704546244e0d6efec3cb0e8c451eb0f"} Apr 16 18:16:55.832963 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.832943 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pbnvm" Apr 16 18:16:55.833491 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.833409 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pbnvm" Apr 16 18:16:55.847156 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:55.847130 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:56.519491 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.519464 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:56.519676 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:56.519565 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8" Apr 16 18:16:56.817800 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.817717 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q5k8m"] Apr 16 18:16:56.817928 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.817840 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m" Apr 16 18:16:56.817972 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:56.817927 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a" Apr 16 18:16:56.820375 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.820350 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4vgjf"] Apr 16 18:16:56.834260 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.834235 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7r5l5"] Apr 16 18:16:56.834628 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.834345 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:16:56.834628 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:56.834426 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58" Apr 16 18:16:56.839171 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.839152 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log" Apr 16 18:16:56.839585 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.839539 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" event={"ID":"7b023a24-7ef2-470d-8cd5-90366b171323","Type":"ContainerStarted","Data":"c726fef223585cde95a6f70f744219dc2677a450ea394ef939a02b6040f04ed9"} Apr 16 18:16:56.839927 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.839897 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:56.841637 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.841618 2562 generic.go:358] "Generic (PLEG): container finished" podID="0f27d004-e5fc-4560-8699-ce203c2bf77e" containerID="8f1c4e60a9b652f2c3c9ec559c433bffa599c81c0ccfdf58d2021abec7736fd2" exitCode=0 Apr 16 18:16:56.841740 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.841695 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87852" event={"ID":"0f27d004-e5fc-4560-8699-ce203c2bf77e","Type":"ContainerDied","Data":"8f1c4e60a9b652f2c3c9ec559c433bffa599c81c0ccfdf58d2021abec7736fd2"} Apr 16 18:16:56.841740 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.841724 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:56.841986 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:56.841960 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8" Apr 16 18:16:56.855875 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.855849 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:16:56.875474 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:56.875418 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" podStartSLOduration=9.299297029 podStartE2EDuration="27.87540026s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:16:30.762957228 +0000 UTC m=+1.848406599" lastFinishedPulling="2026-04-16 18:16:49.339060444 +0000 UTC m=+20.424509830" observedRunningTime="2026-04-16 18:16:56.8745331 +0000 UTC m=+27.959982531" watchObservedRunningTime="2026-04-16 18:16:56.87540026 +0000 UTC m=+27.960849652" Apr 16 18:16:57.846208 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:57.846103 2562 generic.go:358] "Generic (PLEG): container finished" podID="0f27d004-e5fc-4560-8699-ce203c2bf77e" containerID="4be081078ebfef6f8426fe957765654b3b81c5ba429ac39f6b56ef717f6bc263" exitCode=0 Apr 16 18:16:57.846555 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:57.846184 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87852" event={"ID":"0f27d004-e5fc-4560-8699-ce203c2bf77e","Type":"ContainerDied","Data":"4be081078ebfef6f8426fe957765654b3b81c5ba429ac39f6b56ef717f6bc263"} Apr 16 18:16:58.519498 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:58.519456 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m" Apr 16 18:16:58.519666 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:58.519513 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:16:58.519666 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:58.519589 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a" Apr 16 18:16:58.519666 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:58.519643 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58" Apr 16 18:16:58.519824 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:16:58.519690 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:16:58.519824 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:16:58.519782 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8" Apr 16 18:17:00.519406 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:00.519092 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m" Apr 16 18:17:00.519406 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:00.519092 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:17:00.519859 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:00.519427 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q5k8m" podUID="97a27848-fb3c-407c-8213-4e08944e760a" Apr 16 18:17:00.519859 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:00.519090 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:17:00.519859 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:00.519482 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7r5l5" podUID="167f8a10-92f4-444e-912f-415dafc03e58" Apr 16 18:17:00.519859 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:00.519602 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8" Apr 16 18:17:02.141659 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.141615 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:17:02.142108 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:02.141776 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:02.142108 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:02.141839 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs podName:7682219f-20c8-40ee-a84d-c68d79df1dd8 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:34.141820166 +0000 UTC m=+65.227269555 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs") pod "network-metrics-daemon-4vgjf" (UID: "7682219f-20c8-40ee-a84d-c68d79df1dd8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:02.212754 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.212721 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-68.ec2.internal" event="NodeReady" Apr 16 18:17:02.212936 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.212885 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:17:02.242555 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.242522 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2ld\" (UniqueName: \"kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld\") pod \"network-check-target-7r5l5\" (UID: \"167f8a10-92f4-444e-912f-415dafc03e58\") " pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:17:02.242741 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:02.242682 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:02.242741 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:02.242705 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:02.242741 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:02.242716 2562 projected.go:194] Error preparing data for projected volume kube-api-access-8z2ld for pod openshift-network-diagnostics/network-check-target-7r5l5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:02.242884 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:02.242777 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld podName:167f8a10-92f4-444e-912f-415dafc03e58 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:34.242753874 +0000 UTC m=+65.328203246 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-8z2ld" (UniqueName: "kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld") pod "network-check-target-7r5l5" (UID: "167f8a10-92f4-444e-912f-415dafc03e58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:02.261801 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.261747 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bz65j"] Apr 16 18:17:02.289988 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.289960 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-646jh"] Apr 16 18:17:02.290178 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.290150 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.292534 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.292509 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:17:02.292534 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.292511 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:17:02.292723 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.292567 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gz4g7\""
Apr 16 18:17:02.311354 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.311325 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-646jh"]
Apr 16 18:17:02.311354 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.311358 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bz65j"]
Apr 16 18:17:02.311582 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.311433 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:17:02.313975 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.313941 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:17:02.314115 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.313988 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:17:02.314115 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.313999 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4b92n\""
Apr 16 18:17:02.314115 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.313987 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:17:02.445004 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.444967 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e33c04f3-e414-4174-8047-0f84ece6cd5d-config-volume\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.445242 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.445018 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.445242 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.445080 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kfvh\" (UniqueName: \"kubernetes.io/projected/e33c04f3-e414-4174-8047-0f84ece6cd5d-kube-api-access-4kfvh\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.445242 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.445110 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e33c04f3-e414-4174-8047-0f84ece6cd5d-tmp-dir\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.445242 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.445226 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:17:02.445426 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.445270 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x2kl\" (UniqueName: \"kubernetes.io/projected/cea240e4-3f15-45d5-a754-105ae5e43a47-kube-api-access-2x2kl\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:17:02.519592 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.519514 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:17:02.519746 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.519517 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:17:02.520033 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.519535 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:17:02.522669 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.522646 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pwdr4\""
Apr 16 18:17:02.522991 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.522974 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:17:02.523310 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.523294 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:17:02.523552 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.523529 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p8pr6\""
Apr 16 18:17:02.523776 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.523758 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:17:02.523978 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.523949 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:17:02.545657 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.545635 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:17:02.545764 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.545678 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x2kl\" (UniqueName: \"kubernetes.io/projected/cea240e4-3f15-45d5-a754-105ae5e43a47-kube-api-access-2x2kl\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:17:02.545764 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.545709 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e33c04f3-e414-4174-8047-0f84ece6cd5d-config-volume\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.545764 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.545735 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.545924 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:02.545777 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:02.545924 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.545806 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kfvh\" (UniqueName: \"kubernetes.io/projected/e33c04f3-e414-4174-8047-0f84ece6cd5d-kube-api-access-4kfvh\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.545924 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:02.545840 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert podName:cea240e4-3f15-45d5-a754-105ae5e43a47 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:03.04581959 +0000 UTC m=+34.131268972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert") pod "ingress-canary-646jh" (UID: "cea240e4-3f15-45d5-a754-105ae5e43a47") : secret "canary-serving-cert" not found
Apr 16 18:17:02.545924 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.545888 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e33c04f3-e414-4174-8047-0f84ece6cd5d-tmp-dir\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.545924 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:02.545900 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:02.546143 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:02.545955 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls podName:e33c04f3-e414-4174-8047-0f84ece6cd5d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:03.045937737 +0000 UTC m=+34.131387119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls") pod "dns-default-bz65j" (UID: "e33c04f3-e414-4174-8047-0f84ece6cd5d") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:02.546237 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.546218 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e33c04f3-e414-4174-8047-0f84ece6cd5d-tmp-dir\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.553005 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.552981 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e33c04f3-e414-4174-8047-0f84ece6cd5d-config-volume\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.556855 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.556829 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kfvh\" (UniqueName: \"kubernetes.io/projected/e33c04f3-e414-4174-8047-0f84ece6cd5d-kube-api-access-4kfvh\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:02.556947 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:02.556893 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x2kl\" (UniqueName: \"kubernetes.io/projected/cea240e4-3f15-45d5-a754-105ae5e43a47-kube-api-access-2x2kl\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:17:03.049766 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:03.049725 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:03.050018 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:03.049838 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:17:03.050018 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:03.049885 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:03.050018 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:03.049915 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:03.050018 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:03.049954 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls podName:e33c04f3-e414-4174-8047-0f84ece6cd5d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:04.049934812 +0000 UTC m=+35.135384185 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls") pod "dns-default-bz65j" (UID: "e33c04f3-e414-4174-8047-0f84ece6cd5d") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:03.050018 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:03.049973 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert podName:cea240e4-3f15-45d5-a754-105ae5e43a47 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:04.049963527 +0000 UTC m=+35.135412900 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert") pod "ingress-canary-646jh" (UID: "cea240e4-3f15-45d5-a754-105ae5e43a47") : secret "canary-serving-cert" not found
Apr 16 18:17:04.056213 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:04.056157 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:17:04.056688 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:04.056220 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:04.056688 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:04.056335 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:04.056688 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:04.056366 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:04.056688 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:04.056420 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert podName:cea240e4-3f15-45d5-a754-105ae5e43a47 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:06.056396778 +0000 UTC m=+37.141846168 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert") pod "ingress-canary-646jh" (UID: "cea240e4-3f15-45d5-a754-105ae5e43a47") : secret "canary-serving-cert" not found
Apr 16 18:17:04.056688 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:04.056449 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls podName:e33c04f3-e414-4174-8047-0f84ece6cd5d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:06.056434455 +0000 UTC m=+37.141883828 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls") pod "dns-default-bz65j" (UID: "e33c04f3-e414-4174-8047-0f84ece6cd5d") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:04.862227 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:04.862177 2562 generic.go:358] "Generic (PLEG): container finished" podID="0f27d004-e5fc-4560-8699-ce203c2bf77e" containerID="adc89133b9bd6d761f29ab00555d3fad1db9fa419224c4ebffd5875d12ac76b9" exitCode=0
Apr 16 18:17:04.862378 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:04.862238 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87852" event={"ID":"0f27d004-e5fc-4560-8699-ce203c2bf77e","Type":"ContainerDied","Data":"adc89133b9bd6d761f29ab00555d3fad1db9fa419224c4ebffd5875d12ac76b9"}
Apr 16 18:17:05.866406 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:05.866375 2562 generic.go:358] "Generic (PLEG): container finished" podID="0f27d004-e5fc-4560-8699-ce203c2bf77e" containerID="4cc69851ed24f9304d31fc048a4a099726f6802319251ef570aca5dcd607cebe" exitCode=0
Apr 16 18:17:05.866876 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:05.866428 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87852" event={"ID":"0f27d004-e5fc-4560-8699-ce203c2bf77e","Type":"ContainerDied","Data":"4cc69851ed24f9304d31fc048a4a099726f6802319251ef570aca5dcd607cebe"}
Apr 16 18:17:06.072327 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:06.072290 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:06.072519 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:06.072409 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:17:06.072519 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:06.072443 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:06.072519 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:06.072510 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls podName:e33c04f3-e414-4174-8047-0f84ece6cd5d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:10.07249253 +0000 UTC m=+41.157941903 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls") pod "dns-default-bz65j" (UID: "e33c04f3-e414-4174-8047-0f84ece6cd5d") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:06.072519 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:06.072516 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:06.072689 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:06.072565 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert podName:cea240e4-3f15-45d5-a754-105ae5e43a47 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:10.072547689 +0000 UTC m=+41.157997078 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert") pod "ingress-canary-646jh" (UID: "cea240e4-3f15-45d5-a754-105ae5e43a47") : secret "canary-serving-cert" not found
Apr 16 18:17:06.871294 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:06.871254 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87852" event={"ID":"0f27d004-e5fc-4560-8699-ce203c2bf77e","Type":"ContainerStarted","Data":"024e82e5da978724c8cc98d4199e32db75f33d35e28b1ca64b60758c2391a600"}
Apr 16 18:17:06.899303 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:06.899246 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-87852" podStartSLOduration=4.882056674 podStartE2EDuration="37.899233138s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:16:30.756380024 +0000 UTC m=+1.841829392" lastFinishedPulling="2026-04-16 18:17:03.773556484 +0000 UTC m=+34.859005856" observedRunningTime="2026-04-16 18:17:06.898986272 +0000 UTC m=+37.984435666" watchObservedRunningTime="2026-04-16 18:17:06.899233138 +0000 UTC m=+37.984682528"
Apr 16 18:17:10.102475 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:10.102437 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:17:10.102955 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:10.102487 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:17:10.102955 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:10.102592 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:10.102955 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:10.102655 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert podName:cea240e4-3f15-45d5-a754-105ae5e43a47 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:18.102640153 +0000 UTC m=+49.188089527 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert") pod "ingress-canary-646jh" (UID: "cea240e4-3f15-45d5-a754-105ae5e43a47") : secret "canary-serving-cert" not found
Apr 16 18:17:10.102955 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:10.102669 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:10.102955 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:10.102719 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls podName:e33c04f3-e414-4174-8047-0f84ece6cd5d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:18.102702349 +0000 UTC m=+49.188151734 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls") pod "dns-default-bz65j" (UID: "e33c04f3-e414-4174-8047-0f84ece6cd5d") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:11.210476 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:11.210434 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:17:11.213527 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:11.213502 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97a27848-fb3c-407c-8213-4e08944e760a-original-pull-secret\") pod \"global-pull-secret-syncer-q5k8m\" (UID: \"97a27848-fb3c-407c-8213-4e08944e760a\") " pod="kube-system/global-pull-secret-syncer-q5k8m"
Apr 16 18:17:11.246900 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:11.246864 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m"
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q5k8m" Apr 16 18:17:11.424362 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:11.424332 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q5k8m"] Apr 16 18:17:11.427770 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:17:11.427746 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a27848_fb3c_407c_8213_4e08944e760a.slice/crio-009574dc9f35ccf4e77856dff8d54a16e4e85b469dfc8c37fcb3e9c838cd76eb WatchSource:0}: Error finding container 009574dc9f35ccf4e77856dff8d54a16e4e85b469dfc8c37fcb3e9c838cd76eb: Status 404 returned error can't find the container with id 009574dc9f35ccf4e77856dff8d54a16e4e85b469dfc8c37fcb3e9c838cd76eb Apr 16 18:17:11.880939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:11.880899 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q5k8m" event={"ID":"97a27848-fb3c-407c-8213-4e08944e760a","Type":"ContainerStarted","Data":"009574dc9f35ccf4e77856dff8d54a16e4e85b469dfc8c37fcb3e9c838cd76eb"} Apr 16 18:17:15.889911 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:15.889877 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q5k8m" event={"ID":"97a27848-fb3c-407c-8213-4e08944e760a","Type":"ContainerStarted","Data":"18e8d61f0ca5ca496f64b7b6d49654d5de5c725020a3d3652418a30eebd4fe17"} Apr 16 18:17:15.906060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:15.906012 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-q5k8m" podStartSLOduration=32.97676798 podStartE2EDuration="36.905999334s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:17:11.429612175 +0000 UTC m=+42.515061558" lastFinishedPulling="2026-04-16 18:17:15.358843539 +0000 UTC m=+46.444292912" observedRunningTime="2026-04-16 18:17:15.905432625 +0000 UTC m=+46.990882016" watchObservedRunningTime="2026-04-16 18:17:15.905999334 +0000 UTC m=+46.991448724" Apr 16 18:17:18.163712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:18.163670 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh" Apr 16 18:17:18.164101 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:18.163718 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j" Apr 16 18:17:18.164101 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:18.163846 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:18.164101 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:18.163905 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls podName:e33c04f3-e414-4174-8047-0f84ece6cd5d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:34.163891077 +0000 UTC m=+65.249340449 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls") pod "dns-default-bz65j" (UID: "e33c04f3-e414-4174-8047-0f84ece6cd5d") : secret "dns-default-metrics-tls" not found Apr 16 18:17:18.164101 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:18.163846 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:18.164101 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:18.164003 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert podName:cea240e4-3f15-45d5-a754-105ae5e43a47 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:34.163989817 +0000 UTC m=+65.249439201 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert") pod "ingress-canary-646jh" (UID: "cea240e4-3f15-45d5-a754-105ae5e43a47") : secret "canary-serving-cert" not found Apr 16 18:17:28.858832 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:28.858803 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tttq4" Apr 16 18:17:34.177442 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.177404 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:17:34.177866 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.177474 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh" Apr 16 18:17:34.177866 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.177528 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j" Apr 16 18:17:34.177866 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:34.177614 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:34.177866 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:34.177617 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:34.177866 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:34.177662 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls podName:e33c04f3-e414-4174-8047-0f84ece6cd5d nodeName:}" failed. No retries permitted until 2026-04-16 18:18:06.177647678 +0000 UTC m=+97.263097052 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls") pod "dns-default-bz65j" (UID: "e33c04f3-e414-4174-8047-0f84ece6cd5d") : secret "dns-default-metrics-tls" not found Apr 16 18:17:34.177866 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:34.177675 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert podName:cea240e4-3f15-45d5-a754-105ae5e43a47 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:06.177668373 +0000 UTC m=+97.263117745 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert") pod "ingress-canary-646jh" (UID: "cea240e4-3f15-45d5-a754-105ae5e43a47") : secret "canary-serving-cert" not found Apr 16 18:17:34.180489 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.180468 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:17:34.188454 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:34.188432 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:17:34.188535 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:17:34.188491 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs podName:7682219f-20c8-40ee-a84d-c68d79df1dd8 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:38.188477247 +0000 UTC m=+129.273926616 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs") pod "network-metrics-daemon-4vgjf" (UID: "7682219f-20c8-40ee-a84d-c68d79df1dd8") : secret "metrics-daemon-secret" not found Apr 16 18:17:34.278723 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.278687 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2ld\" (UniqueName: \"kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld\") pod \"network-check-target-7r5l5\" (UID: \"167f8a10-92f4-444e-912f-415dafc03e58\") " pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:17:34.281486 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.281463 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:17:34.292262 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.292229 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:17:34.302992 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.302961 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z2ld\" (UniqueName: \"kubernetes.io/projected/167f8a10-92f4-444e-912f-415dafc03e58-kube-api-access-8z2ld\") pod \"network-check-target-7r5l5\" (UID: \"167f8a10-92f4-444e-912f-415dafc03e58\") " pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:17:34.337246 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.337216 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p8pr6\"" Apr 16 18:17:34.345695 ip-10-0-128-68 kubenswrapper[2562]: I0416 
18:17:34.345662 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:17:34.480476 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.480445 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7r5l5"] Apr 16 18:17:34.483880 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:17:34.483853 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod167f8a10_92f4_444e_912f_415dafc03e58.slice/crio-801692ff4d4730073dcb6105bd60e9608503a6b95541fbe2225863c526d9510a WatchSource:0}: Error finding container 801692ff4d4730073dcb6105bd60e9608503a6b95541fbe2225863c526d9510a: Status 404 returned error can't find the container with id 801692ff4d4730073dcb6105bd60e9608503a6b95541fbe2225863c526d9510a Apr 16 18:17:34.925508 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:34.925466 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7r5l5" event={"ID":"167f8a10-92f4-444e-912f-415dafc03e58","Type":"ContainerStarted","Data":"801692ff4d4730073dcb6105bd60e9608503a6b95541fbe2225863c526d9510a"} Apr 16 18:17:37.933287 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:37.933249 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7r5l5" event={"ID":"167f8a10-92f4-444e-912f-415dafc03e58","Type":"ContainerStarted","Data":"9ce43392159c73c4a971f7ebac1cff69b83f985a2168ce9348e034c2a85dfab6"} Apr 16 18:17:37.933726 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:37.933374 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7r5l5" Apr 16 18:17:37.950697 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:17:37.950626 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7r5l5" podStartSLOduration=66.156230819 podStartE2EDuration="1m8.95060839s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:17:34.485723343 +0000 UTC m=+65.571172712" lastFinishedPulling="2026-04-16 18:17:37.280100895 +0000 UTC m=+68.365550283" observedRunningTime="2026-04-16 18:17:37.949938587 +0000 UTC m=+69.035387977" watchObservedRunningTime="2026-04-16 18:17:37.95060839 +0000 UTC m=+69.036057782" Apr 16 18:18:06.198987 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:06.198857 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j" Apr 16 18:18:06.198987 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:06.198934 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh" Apr 16 18:18:06.199488 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:06.199008 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:06.199488 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:06.199022 2562 
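The pod_startup_latency_tracker lines each carry two durations, and in this log they are related by simple arithmetic: podStartSLOduration is podStartE2EDuration (watchObservedRunningTime minus podCreationTimestamp) with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted out. A quick check against the network-check-target-7r5l5 entry above, using the timestamps as printed (illustrative arithmetic only; the kubelet computes from its monotonic clock, so the last few digits differ slightly):

    // sloduration.go - reproduces the arithmetic behind the
    // pod_startup_latency_tracker line for network-check-target-7r5l5.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        ts := func(h, m, s, ns int) time.Time {
            return time.Date(2026, time.April, 16, h, m, s, ns, time.UTC)
        }
        created := ts(18, 16, 29, 0)           // podCreationTimestamp
        firstPull := ts(18, 17, 34, 485723343) // firstStartedPulling
        lastPull := ts(18, 17, 37, 280100895)  // lastFinishedPulling
        running := ts(18, 17, 37, 950608390)   // watchObservedRunningTime

        e2e := running.Sub(created)     // 1m8.95060839s = podStartE2EDuration
        pull := lastPull.Sub(firstPull) // 2.794377552s = image-pull window
        fmt.Println(e2e, pull, e2e-pull) // e2e-pull ≈ 1m6.156s = podStartSLOduration
    }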
Apr 16 18:18:06.198987 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:06.198857 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j"
Apr 16 18:18:06.198987 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:06.198934 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh"
Apr 16 18:18:06.199488 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:06.199008 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:06.199488 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:06.199022 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:06.199488 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:06.199087 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert podName:cea240e4-3f15-45d5-a754-105ae5e43a47 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:10.199067158 +0000 UTC m=+161.284516541 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert") pod "ingress-canary-646jh" (UID: "cea240e4-3f15-45d5-a754-105ae5e43a47") : secret "canary-serving-cert" not found
Apr 16 18:18:06.199488 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:06.199099 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls podName:e33c04f3-e414-4174-8047-0f84ece6cd5d nodeName:}" failed. No retries permitted until 2026-04-16 18:19:10.199093911 +0000 UTC m=+161.284543283 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls") pod "dns-default-bz65j" (UID: "e33c04f3-e414-4174-8047-0f84ece6cd5d") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:08.936995 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:08.936962 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7r5l5"
Apr 16 18:18:38.216073 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:38.216016 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:18:38.216609 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:38.216134 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:18:38.216609 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:38.216228 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs podName:7682219f-20c8-40ee-a84d-c68d79df1dd8 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:40.216212186 +0000 UTC m=+251.301661561 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs") pod "network-metrics-daemon-4vgjf" (UID: "7682219f-20c8-40ee-a84d-c68d79df1dd8") : secret "metrics-daemon-secret" not found
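Reading the durationBeforeRetry values across these entries gives the retry schedule in full: each stuck volume is retried after 500ms, 1s, 2s, 4s, 8s, 16s, 32s, then 1m4s, and the metrics-certs operation above reaches 2m2s, the largest spacing this log shows, consistent with an exponential backoff capped at roughly two minutes. A minimal sketch of that doubling-with-cap pattern, matching the numbers in the log (illustrative only, not the kubelet's source):

    // backoff.go - prints the retry spacing visible in the durationBeforeRetry
    // values of this log: doubling from 500ms, capped at 2m2s (the cap value is
    // inferred from the log, not taken from kubelet code).
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const (
            initial  = 500 * time.Millisecond
            maxDelay = 2*time.Minute + 2*time.Second // largest value the log shows
        )
        for d := initial; ; d *= 2 {
            if d > maxDelay {
                d = maxDelay
            }
            fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s
            if d == maxDelay {
                break
            }
        }
    }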
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-9q4ds" Apr 16 18:18:43.533773 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.533755 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-gk84z\"" Apr 16 18:18:43.534973 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.534946 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:18:43.535077 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.534957 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 18:18:43.535969 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.535887 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-75585f59db-85qwd"] Apr 16 18:18:43.538469 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.538451 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.540818 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.540799 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:18:43.540908 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.540820 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:18:43.540955 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.540920 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qqz6d\"" Apr 16 18:18:43.541600 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.541585 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:18:43.543857 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.543836 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-9q4ds"] Apr 16 18:18:43.547390 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.547370 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:18:43.572302 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.572264 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-75585f59db-85qwd"] Apr 16 18:18:43.632257 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.632228 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r"] Apr 16 18:18:43.635066 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.635044 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 18:18:43.636352 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.636321 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7"] Apr 16 18:18:43.638889 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.638874 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.639454 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.639435 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 18:18:43.639549 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.639504 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 18:18:43.639610 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.639549 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:18:43.639610 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.639555 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-xvfdf\"" Apr 16 18:18:43.640372 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.640355 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 18:18:43.642651 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.642632 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 18:18:43.642738 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.642669 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 18:18:43.642810 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.642792 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-skhm8\"" Apr 16 18:18:43.642905 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.642885 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 18:18:43.643364 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.643347 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:18:43.646803 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.646783 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r"] Apr 16 18:18:43.650843 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.650817 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7"] Apr 16 18:18:43.654336 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.654313 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-registry-certificates\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.654438 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.654366 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-trusted-ca\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.654438 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.654384 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-bound-sa-token\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.654438 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.654431 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkdfw\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-kube-api-access-dkdfw\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.654635 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.654448 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.654635 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.654463 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-installation-pull-secrets\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.654635 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.654484 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55883b38-898b-4426-9e22-f96a487c90c6-ca-trust-extracted\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.654635 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.654502 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-image-registry-private-configuration\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.654635 ip-10-0-128-68 kubenswrapper[2562]: I0416 
18:18:43.654546 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkc7z\" (UniqueName: \"kubernetes.io/projected/9292231d-6f76-4890-98d2-105390472ac1-kube-api-access-hkc7z\") pod \"volume-data-source-validator-7d955d5dd4-9q4ds\" (UID: \"9292231d-6f76-4890-98d2-105390472ac1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-9q4ds" Apr 16 18:18:43.755694 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.755655 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac44aa61-86ed-41f4-aab0-bbabab9224b1-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-dwl9r\" (UID: \"ac44aa61-86ed-41f4-aab0-bbabab9224b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 18:18:43.755865 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.755716 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkdfw\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-kube-api-access-dkdfw\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.755865 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.755762 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.755865 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:43.755838 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:18:43.755865 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:43.755852 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-75585f59db-85qwd: secret "image-registry-tls" not found Apr 16 18:18:43.756013 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.755871 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-installation-pull-secrets\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.756013 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:43.755909 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls podName:55883b38-898b-4426-9e22-f96a487c90c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:44.255887334 +0000 UTC m=+135.341336703 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls") pod "image-registry-75585f59db-85qwd" (UID: "55883b38-898b-4426-9e22-f96a487c90c6") : secret "image-registry-tls" not found Apr 16 18:18:43.756013 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.755940 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55883b38-898b-4426-9e22-f96a487c90c6-ca-trust-extracted\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.756013 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.755970 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-image-registry-private-configuration\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.756013 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756004 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f57c457-af5c-41b7-b405-ac38ae8bd95a-serving-cert\") pod \"service-ca-operator-69965bb79d-72hl7\" (UID: \"6f57c457-af5c-41b7-b405-ac38ae8bd95a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.756314 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756032 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkc7z\" (UniqueName: \"kubernetes.io/projected/9292231d-6f76-4890-98d2-105390472ac1-kube-api-access-hkc7z\") pod \"volume-data-source-validator-7d955d5dd4-9q4ds\" (UID: \"9292231d-6f76-4890-98d2-105390472ac1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-9q4ds" Apr 16 18:18:43.756314 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756060 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnrsp\" (UniqueName: \"kubernetes.io/projected/6f57c457-af5c-41b7-b405-ac38ae8bd95a-kube-api-access-vnrsp\") pod \"service-ca-operator-69965bb79d-72hl7\" (UID: \"6f57c457-af5c-41b7-b405-ac38ae8bd95a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.756314 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756112 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-registry-certificates\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.756314 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756183 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac44aa61-86ed-41f4-aab0-bbabab9224b1-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-dwl9r\" (UID: \"ac44aa61-86ed-41f4-aab0-bbabab9224b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 
18:18:43.756314 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756244 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwn29\" (UniqueName: \"kubernetes.io/projected/ac44aa61-86ed-41f4-aab0-bbabab9224b1-kube-api-access-gwn29\") pod \"kube-storage-version-migrator-operator-756bb7d76f-dwl9r\" (UID: \"ac44aa61-86ed-41f4-aab0-bbabab9224b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 18:18:43.756314 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756293 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f57c457-af5c-41b7-b405-ac38ae8bd95a-config\") pod \"service-ca-operator-69965bb79d-72hl7\" (UID: \"6f57c457-af5c-41b7-b405-ac38ae8bd95a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.756564 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756320 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-trusted-ca\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.756564 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756364 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-bound-sa-token\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.756564 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756456 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55883b38-898b-4426-9e22-f96a487c90c6-ca-trust-extracted\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.756753 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.756734 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-registry-certificates\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.757277 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.757258 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-trusted-ca\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.758403 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.758379 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-installation-pull-secrets\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 
18:18:43.758491 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.758458 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-image-registry-private-configuration\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.769021 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.768999 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkc7z\" (UniqueName: \"kubernetes.io/projected/9292231d-6f76-4890-98d2-105390472ac1-kube-api-access-hkc7z\") pod \"volume-data-source-validator-7d955d5dd4-9q4ds\" (UID: \"9292231d-6f76-4890-98d2-105390472ac1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-9q4ds" Apr 16 18:18:43.769633 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.769611 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkdfw\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-kube-api-access-dkdfw\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.776474 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.776453 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-bound-sa-token\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:43.838754 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.838655 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-9q4ds" Apr 16 18:18:43.856708 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.856679 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f57c457-af5c-41b7-b405-ac38ae8bd95a-serving-cert\") pod \"service-ca-operator-69965bb79d-72hl7\" (UID: \"6f57c457-af5c-41b7-b405-ac38ae8bd95a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.856835 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.856723 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnrsp\" (UniqueName: \"kubernetes.io/projected/6f57c457-af5c-41b7-b405-ac38ae8bd95a-kube-api-access-vnrsp\") pod \"service-ca-operator-69965bb79d-72hl7\" (UID: \"6f57c457-af5c-41b7-b405-ac38ae8bd95a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.856835 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.856764 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac44aa61-86ed-41f4-aab0-bbabab9224b1-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-dwl9r\" (UID: \"ac44aa61-86ed-41f4-aab0-bbabab9224b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 18:18:43.856835 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.856789 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwn29\" (UniqueName: \"kubernetes.io/projected/ac44aa61-86ed-41f4-aab0-bbabab9224b1-kube-api-access-gwn29\") pod \"kube-storage-version-migrator-operator-756bb7d76f-dwl9r\" (UID: \"ac44aa61-86ed-41f4-aab0-bbabab9224b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 18:18:43.857042 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.857010 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f57c457-af5c-41b7-b405-ac38ae8bd95a-config\") pod \"service-ca-operator-69965bb79d-72hl7\" (UID: \"6f57c457-af5c-41b7-b405-ac38ae8bd95a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.857165 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.857069 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac44aa61-86ed-41f4-aab0-bbabab9224b1-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-dwl9r\" (UID: \"ac44aa61-86ed-41f4-aab0-bbabab9224b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 18:18:43.857538 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.857519 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f57c457-af5c-41b7-b405-ac38ae8bd95a-config\") pod \"service-ca-operator-69965bb79d-72hl7\" (UID: \"6f57c457-af5c-41b7-b405-ac38ae8bd95a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.858065 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.858040 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ac44aa61-86ed-41f4-aab0-bbabab9224b1-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-dwl9r\" (UID: \"ac44aa61-86ed-41f4-aab0-bbabab9224b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 18:18:43.859108 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.859085 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f57c457-af5c-41b7-b405-ac38ae8bd95a-serving-cert\") pod \"service-ca-operator-69965bb79d-72hl7\" (UID: \"6f57c457-af5c-41b7-b405-ac38ae8bd95a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.859655 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.859636 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac44aa61-86ed-41f4-aab0-bbabab9224b1-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-dwl9r\" (UID: \"ac44aa61-86ed-41f4-aab0-bbabab9224b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 18:18:43.865132 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.865110 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnrsp\" (UniqueName: \"kubernetes.io/projected/6f57c457-af5c-41b7-b405-ac38ae8bd95a-kube-api-access-vnrsp\") pod \"service-ca-operator-69965bb79d-72hl7\" (UID: \"6f57c457-af5c-41b7-b405-ac38ae8bd95a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.865461 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.865445 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwn29\" (UniqueName: \"kubernetes.io/projected/ac44aa61-86ed-41f4-aab0-bbabab9224b1-kube-api-access-gwn29\") pod \"kube-storage-version-migrator-operator-756bb7d76f-dwl9r\" (UID: \"ac44aa61-86ed-41f4-aab0-bbabab9224b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 18:18:43.945664 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.945629 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" Apr 16 18:18:43.951109 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.951079 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" Apr 16 18:18:43.952001 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:43.951926 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-9q4ds"] Apr 16 18:18:43.957021 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:18:43.956989 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9292231d_6f76_4890_98d2_105390472ac1.slice/crio-063d518b53d3670b4c7903c8397eac86ecd2e48148c4369f84afe0aff26a8e17 WatchSource:0}: Error finding container 063d518b53d3670b4c7903c8397eac86ecd2e48148c4369f84afe0aff26a8e17: Status 404 returned error can't find the container with id 063d518b53d3670b4c7903c8397eac86ecd2e48148c4369f84afe0aff26a8e17 Apr 16 18:18:44.060980 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:44.060946 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-9q4ds" event={"ID":"9292231d-6f76-4890-98d2-105390472ac1","Type":"ContainerStarted","Data":"063d518b53d3670b4c7903c8397eac86ecd2e48148c4369f84afe0aff26a8e17"} Apr 16 18:18:44.073071 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:44.073046 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r"] Apr 16 18:18:44.074106 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:18:44.074077 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac44aa61_86ed_41f4_aab0_bbabab9224b1.slice/crio-5508c9bd1e9afd154689cdbcab2ff1fd0dcc5bad5f6bed4a3fb43e2e2265447d WatchSource:0}: Error finding container 5508c9bd1e9afd154689cdbcab2ff1fd0dcc5bad5f6bed4a3fb43e2e2265447d: Status 404 returned error can't find the container with id 5508c9bd1e9afd154689cdbcab2ff1fd0dcc5bad5f6bed4a3fb43e2e2265447d Apr 16 18:18:44.087687 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:44.087657 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7"] Apr 16 18:18:44.090123 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:18:44.090058 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f57c457_af5c_41b7_b405_ac38ae8bd95a.slice/crio-f014f529c1649a1936cfda6b095c299c963da8868fd0ee9d0ac685320ffa5717 WatchSource:0}: Error finding container f014f529c1649a1936cfda6b095c299c963da8868fd0ee9d0ac685320ffa5717: Status 404 returned error can't find the container with id f014f529c1649a1936cfda6b095c299c963da8868fd0ee9d0ac685320ffa5717 Apr 16 18:18:44.261245 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:44.261209 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:44.261417 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:44.261356 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:18:44.261417 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:44.261377 2562 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-75585f59db-85qwd: secret "image-registry-tls" not found Apr 16 18:18:44.261490 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:44.261458 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls podName:55883b38-898b-4426-9e22-f96a487c90c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:45.261440409 +0000 UTC m=+136.346889782 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls") pod "image-registry-75585f59db-85qwd" (UID: "55883b38-898b-4426-9e22-f96a487c90c6") : secret "image-registry-tls" not found Apr 16 18:18:45.064835 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:45.064792 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" event={"ID":"ac44aa61-86ed-41f4-aab0-bbabab9224b1","Type":"ContainerStarted","Data":"5508c9bd1e9afd154689cdbcab2ff1fd0dcc5bad5f6bed4a3fb43e2e2265447d"} Apr 16 18:18:45.066139 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:45.066110 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" event={"ID":"6f57c457-af5c-41b7-b405-ac38ae8bd95a","Type":"ContainerStarted","Data":"f014f529c1649a1936cfda6b095c299c963da8868fd0ee9d0ac685320ffa5717"} Apr 16 18:18:45.271259 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:45.271216 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:45.271436 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:45.271384 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:18:45.271436 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:45.271402 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-75585f59db-85qwd: secret "image-registry-tls" not found Apr 16 18:18:45.271556 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:45.271467 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls podName:55883b38-898b-4426-9e22-f96a487c90c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:47.271444816 +0000 UTC m=+138.356894199 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls") pod "image-registry-75585f59db-85qwd" (UID: "55883b38-898b-4426-9e22-f96a487c90c6") : secret "image-registry-tls" not found Apr 16 18:18:46.069276 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:46.069245 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-9q4ds" event={"ID":"9292231d-6f76-4890-98d2-105390472ac1","Type":"ContainerStarted","Data":"799ce6b5446eb10dcde12ee23949731ad83e9fe8004560996e2a4729517f3bcb"} Apr 16 18:18:46.085277 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:46.085102 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-9q4ds" podStartSLOduration=1.867280857 podStartE2EDuration="3.085083963s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="2026-04-16 18:18:43.95949758 +0000 UTC m=+135.044946950" lastFinishedPulling="2026-04-16 18:18:45.17730068 +0000 UTC m=+136.262750056" observedRunningTime="2026-04-16 18:18:46.084675372 +0000 UTC m=+137.170124765" watchObservedRunningTime="2026-04-16 18:18:46.085083963 +0000 UTC m=+137.170533355" Apr 16 18:18:47.072927 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:47.072891 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" event={"ID":"6f57c457-af5c-41b7-b405-ac38ae8bd95a","Type":"ContainerStarted","Data":"ff9ddf4ddbc7cb03154eca9d48b27cf6020b31b12ee438a1107dc8a48705c6c0"} Apr 16 18:18:47.074343 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:47.074315 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" event={"ID":"ac44aa61-86ed-41f4-aab0-bbabab9224b1","Type":"ContainerStarted","Data":"34a875e4b3ca064323135ad75840d09046136ff75ba6132aa7ad8359c519a8a1"} Apr 16 18:18:47.091040 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:47.090997 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" podStartSLOduration=1.586688243 podStartE2EDuration="4.09098228s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="2026-04-16 18:18:44.091935184 +0000 UTC m=+135.177384556" lastFinishedPulling="2026-04-16 18:18:46.596229224 +0000 UTC m=+137.681678593" observedRunningTime="2026-04-16 18:18:47.09041914 +0000 UTC m=+138.175868533" watchObservedRunningTime="2026-04-16 18:18:47.09098228 +0000 UTC m=+138.176431683" Apr 16 18:18:47.107285 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:47.107230 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" podStartSLOduration=1.590203719 podStartE2EDuration="4.107212104s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="2026-04-16 18:18:44.07611806 +0000 UTC m=+135.161567429" lastFinishedPulling="2026-04-16 18:18:46.593126432 +0000 UTC m=+137.678575814" observedRunningTime="2026-04-16 18:18:47.107104607 +0000 UTC m=+138.192554001" watchObservedRunningTime="2026-04-16 18:18:47.107212104 +0000 UTC m=+138.192661495" Apr 16 18:18:47.287725 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:47.287687 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:47.287893 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:47.287811 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:18:47.287893 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:47.287823 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-75585f59db-85qwd: secret "image-registry-tls" not found Apr 16 18:18:47.287893 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:47.287868 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls podName:55883b38-898b-4426-9e22-f96a487c90c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:51.287855448 +0000 UTC m=+142.373304821 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls") pod "image-registry-75585f59db-85qwd" (UID: "55883b38-898b-4426-9e22-f96a487c90c6") : secret "image-registry-tls" not found Apr 16 18:18:48.249089 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.249054 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw"] Apr 16 18:18:48.251931 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.251914 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw" Apr 16 18:18:48.254514 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.254481 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-f2m9b\"" Apr 16 18:18:48.254762 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.254747 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 18:18:48.255489 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.255475 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 18:18:48.261996 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.261964 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw"] Apr 16 18:18:48.402535 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.402494 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dptr7\" (UniqueName: \"kubernetes.io/projected/075eb149-d2bc-4930-aa0f-185e8cc92d22-kube-api-access-dptr7\") pod \"migrator-64d4d94569-6wlxw\" (UID: \"075eb149-d2bc-4930-aa0f-185e8cc92d22\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw" Apr 16 18:18:48.503652 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.503570 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dptr7\" (UniqueName: \"kubernetes.io/projected/075eb149-d2bc-4930-aa0f-185e8cc92d22-kube-api-access-dptr7\") pod \"migrator-64d4d94569-6wlxw\" (UID: \"075eb149-d2bc-4930-aa0f-185e8cc92d22\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw" Apr 16 18:18:48.518085 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.518047 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dptr7\" (UniqueName: \"kubernetes.io/projected/075eb149-d2bc-4930-aa0f-185e8cc92d22-kube-api-access-dptr7\") pod \"migrator-64d4d94569-6wlxw\" (UID: \"075eb149-d2bc-4930-aa0f-185e8cc92d22\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw" Apr 16 18:18:48.561382 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.561343 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw" Apr 16 18:18:48.682974 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:48.682941 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw"] Apr 16 18:18:48.687034 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:18:48.687001 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod075eb149_d2bc_4930_aa0f_185e8cc92d22.slice/crio-97af7539e50e3e6fc6f3e3354d5728086112c60eda3fb7d2b72a223b4ec499f3 WatchSource:0}: Error finding container 97af7539e50e3e6fc6f3e3354d5728086112c60eda3fb7d2b72a223b4ec499f3: Status 404 returned error can't find the container with id 97af7539e50e3e6fc6f3e3354d5728086112c60eda3fb7d2b72a223b4ec499f3 Apr 16 18:18:49.080645 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:49.080605 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw" event={"ID":"075eb149-d2bc-4930-aa0f-185e8cc92d22","Type":"ContainerStarted","Data":"97af7539e50e3e6fc6f3e3354d5728086112c60eda3fb7d2b72a223b4ec499f3"} Apr 16 18:18:49.192092 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:49.192063 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8spcs_1ac43e44-6bc5-4678-9022-029aed19a8c1/dns-node-resolver/0.log" Apr 16 18:18:50.087936 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:50.087890 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw" event={"ID":"075eb149-d2bc-4930-aa0f-185e8cc92d22","Type":"ContainerStarted","Data":"e53a79a6635acf968a30d338f2840c7e88eb4a7b13e9a8a574b33ace12cac897"} Apr 16 18:18:50.087936 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:50.087938 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw" event={"ID":"075eb149-d2bc-4930-aa0f-185e8cc92d22","Type":"ContainerStarted","Data":"ed28987cf9f4c33e3e6dcd835e413499ace5e36feeb7eccad5223f39ccc8063c"} Apr 16 18:18:50.104865 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:50.104792 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6wlxw" podStartSLOduration=0.926335534 podStartE2EDuration="2.104778648s" podCreationTimestamp="2026-04-16 18:18:48 +0000 UTC" firstStartedPulling="2026-04-16 18:18:48.688913681 +0000 UTC m=+139.774363050" lastFinishedPulling="2026-04-16 18:18:49.867356793 +0000 UTC m=+140.952806164" observedRunningTime="2026-04-16 18:18:50.103996137 +0000 UTC m=+141.189445529" watchObservedRunningTime="2026-04-16 18:18:50.104778648 +0000 UTC m=+141.190228039" Apr 16 18:18:50.197338 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:50.197314 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5m9b6_d4f7966d-78bf-4cbb-a764-b066fe69e484/node-ca/0.log" Apr 16 18:18:51.325697 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:51.325658 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:51.326053 ip-10-0-128-68 
kubenswrapper[2562]: E0416 18:18:51.325778 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:18:51.326053 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:51.325790 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-75585f59db-85qwd: secret "image-registry-tls" not found Apr 16 18:18:51.326053 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:18:51.325842 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls podName:55883b38-898b-4426-9e22-f96a487c90c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:59.325826846 +0000 UTC m=+150.411276218 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls") pod "image-registry-75585f59db-85qwd" (UID: "55883b38-898b-4426-9e22-f96a487c90c6") : secret "image-registry-tls" not found Apr 16 18:18:59.388609 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:59.388575 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:59.390967 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:59.390939 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls\") pod \"image-registry-75585f59db-85qwd\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") " pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:59.447049 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:59.447011 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:18:59.576644 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:18:59.576615 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-75585f59db-85qwd"] Apr 16 18:18:59.580947 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:18:59.580908 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55883b38_898b_4426_9e22_f96a487c90c6.slice/crio-d8af525dddc2309cbe14f5d7787881d1db64ed3836b6d72423d1f991b771cfdc WatchSource:0}: Error finding container d8af525dddc2309cbe14f5d7787881d1db64ed3836b6d72423d1f991b771cfdc: Status 404 returned error can't find the container with id d8af525dddc2309cbe14f5d7787881d1db64ed3836b6d72423d1f991b771cfdc Apr 16 18:19:00.113025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:00.112992 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75585f59db-85qwd" event={"ID":"55883b38-898b-4426-9e22-f96a487c90c6","Type":"ContainerStarted","Data":"66eb5f33f14ef8938585f052f792a2d9d1e3f816246da7c34d126ad45f15d431"} Apr 16 18:19:00.113025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:00.113027 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75585f59db-85qwd" event={"ID":"55883b38-898b-4426-9e22-f96a487c90c6","Type":"ContainerStarted","Data":"d8af525dddc2309cbe14f5d7787881d1db64ed3836b6d72423d1f991b771cfdc"} Apr 16 18:19:00.113273 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:00.113145 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:19:05.301132 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:19:05.301086 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-bz65j" podUID="e33c04f3-e414-4174-8047-0f84ece6cd5d" Apr 16 18:19:05.329561 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:19:05.329526 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-646jh" podUID="cea240e4-3f15-45d5-a754-105ae5e43a47" Apr 16 18:19:05.541236 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:19:05.541176 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4vgjf" podUID="7682219f-20c8-40ee-a84d-c68d79df1dd8" Apr 16 18:19:06.127671 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:06.127639 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bz65j" Apr 16 18:19:06.127852 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:06.127639 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-646jh" Apr 16 18:19:10.265020 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.264981 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh" Apr 16 18:19:10.265538 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.265032 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j" Apr 16 18:19:10.267395 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.267374 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e33c04f3-e414-4174-8047-0f84ece6cd5d-metrics-tls\") pod \"dns-default-bz65j\" (UID: \"e33c04f3-e414-4174-8047-0f84ece6cd5d\") " pod="openshift-dns/dns-default-bz65j" Apr 16 18:19:10.267493 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.267419 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea240e4-3f15-45d5-a754-105ae5e43a47-cert\") pod \"ingress-canary-646jh\" (UID: \"cea240e4-3f15-45d5-a754-105ae5e43a47\") " pod="openshift-ingress-canary/ingress-canary-646jh" Apr 16 18:19:10.331205 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.331158 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4b92n\"" Apr 16 18:19:10.332117 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.332102 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gz4g7\"" Apr 16 18:19:10.338930 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.338908 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-646jh" Apr 16 18:19:10.339040 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.339002 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bz65j" Apr 16 18:19:10.465164 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.465094 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-75585f59db-85qwd" podStartSLOduration=27.465074094 podStartE2EDuration="27.465074094s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:00.144704779 +0000 UTC m=+151.230154170" watchObservedRunningTime="2026-04-16 18:19:10.465074094 +0000 UTC m=+161.550523486" Apr 16 18:19:10.466239 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.466148 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bz65j"] Apr 16 18:19:10.468855 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:19:10.468828 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode33c04f3_e414_4174_8047_0f84ece6cd5d.slice/crio-240a9a9aa5f1ee6be644a84ba72712b172f0eb19c6c981f53daf35df05809e07 WatchSource:0}: Error finding container 240a9a9aa5f1ee6be644a84ba72712b172f0eb19c6c981f53daf35df05809e07: Status 404 returned error can't find the container with id 240a9a9aa5f1ee6be644a84ba72712b172f0eb19c6c981f53daf35df05809e07 Apr 16 18:19:10.478456 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:10.478434 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-646jh"] Apr 16 18:19:10.481127 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:19:10.481100 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcea240e4_3f15_45d5_a754_105ae5e43a47.slice/crio-c40990bb04c89a1407cb7fff5c536caa98dcbe2d8d0b65afb9f37a7fbfce2990 WatchSource:0}: Error finding container c40990bb04c89a1407cb7fff5c536caa98dcbe2d8d0b65afb9f37a7fbfce2990: Status 404 returned error can't find the container with id c40990bb04c89a1407cb7fff5c536caa98dcbe2d8d0b65afb9f37a7fbfce2990 Apr 16 18:19:11.140575 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:11.140520 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-646jh" event={"ID":"cea240e4-3f15-45d5-a754-105ae5e43a47","Type":"ContainerStarted","Data":"c40990bb04c89a1407cb7fff5c536caa98dcbe2d8d0b65afb9f37a7fbfce2990"} Apr 16 18:19:11.141909 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:11.141873 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bz65j" event={"ID":"e33c04f3-e414-4174-8047-0f84ece6cd5d","Type":"ContainerStarted","Data":"240a9a9aa5f1ee6be644a84ba72712b172f0eb19c6c981f53daf35df05809e07"} Apr 16 18:19:12.047833 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.047795 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dn8ld"] Apr 16 18:19:12.051224 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.051178 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.054091 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.054067 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:19:12.055092 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.054856 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pnvp2\"" Apr 16 18:19:12.055092 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.054925 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:19:12.055092 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.054907 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:19:12.055092 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.054984 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:19:12.064076 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.064051 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dn8ld"] Apr 16 18:19:12.077968 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.077939 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7b335442-4810-4f3b-a541-31865a746c8b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.078143 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.077974 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7b335442-4810-4f3b-a541-31865a746c8b-crio-socket\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.078143 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.078043 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7b335442-4810-4f3b-a541-31865a746c8b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.078143 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.078073 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b335442-4810-4f3b-a541-31865a746c8b-data-volume\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.078143 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.078107 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdhjz\" (UniqueName: \"kubernetes.io/projected/7b335442-4810-4f3b-a541-31865a746c8b-kube-api-access-mdhjz\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " 
pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.104607 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.104564 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-75585f59db-85qwd"] Apr 16 18:19:12.138892 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.138860 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6994467cc7-df69f"] Apr 16 18:19:12.141865 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.141844 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.155259 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.155225 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6994467cc7-df69f"] Apr 16 18:19:12.178639 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.178599 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2f036a5-fee5-49c7-9a46-fd0ccba53895-trusted-ca\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.178811 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.178653 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7b335442-4810-4f3b-a541-31865a746c8b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.178811 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.178687 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7b335442-4810-4f3b-a541-31865a746c8b-crio-socket\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.178811 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.178727 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2f036a5-fee5-49c7-9a46-fd0ccba53895-ca-trust-extracted\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.178811 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.178783 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl29j\" (UniqueName: \"kubernetes.io/projected/e2f036a5-fee5-49c7-9a46-fd0ccba53895-kube-api-access-fl29j\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.179022 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.178796 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7b335442-4810-4f3b-a541-31865a746c8b-crio-socket\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.179022 ip-10-0-128-68 kubenswrapper[2562]: I0416 
18:19:12.178837 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7b335442-4810-4f3b-a541-31865a746c8b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.179022 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.178920 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2f036a5-fee5-49c7-9a46-fd0ccba53895-bound-sa-token\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.179022 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.178954 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2f036a5-fee5-49c7-9a46-fd0ccba53895-installation-pull-secrets\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.179022 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.178987 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b335442-4810-4f3b-a541-31865a746c8b-data-volume\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.179022 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.179015 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdhjz\" (UniqueName: \"kubernetes.io/projected/7b335442-4810-4f3b-a541-31865a746c8b-kube-api-access-mdhjz\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.179287 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.179066 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2f036a5-fee5-49c7-9a46-fd0ccba53895-registry-tls\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.179287 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.179097 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2f036a5-fee5-49c7-9a46-fd0ccba53895-registry-certificates\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.179287 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.179126 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2f036a5-fee5-49c7-9a46-fd0ccba53895-image-registry-private-configuration\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 
16 18:19:12.179287 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.179270 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b335442-4810-4f3b-a541-31865a746c8b-data-volume\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.179479 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.179270 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7b335442-4810-4f3b-a541-31865a746c8b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.181509 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.181484 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7b335442-4810-4f3b-a541-31865a746c8b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.192267 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.192234 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdhjz\" (UniqueName: \"kubernetes.io/projected/7b335442-4810-4f3b-a541-31865a746c8b-kube-api-access-mdhjz\") pod \"insights-runtime-extractor-dn8ld\" (UID: \"7b335442-4810-4f3b-a541-31865a746c8b\") " pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.279706 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.279671 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl29j\" (UniqueName: \"kubernetes.io/projected/e2f036a5-fee5-49c7-9a46-fd0ccba53895-kube-api-access-fl29j\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.279891 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.279721 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2f036a5-fee5-49c7-9a46-fd0ccba53895-bound-sa-token\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.279891 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.279756 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2f036a5-fee5-49c7-9a46-fd0ccba53895-installation-pull-secrets\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.279891 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.279804 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2f036a5-fee5-49c7-9a46-fd0ccba53895-registry-tls\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.280089 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.279997 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2f036a5-fee5-49c7-9a46-fd0ccba53895-registry-certificates\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.280089 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.280043 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2f036a5-fee5-49c7-9a46-fd0ccba53895-image-registry-private-configuration\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.280225 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.280130 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2f036a5-fee5-49c7-9a46-fd0ccba53895-trusted-ca\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.280291 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.280225 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2f036a5-fee5-49c7-9a46-fd0ccba53895-ca-trust-extracted\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.280613 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.280568 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2f036a5-fee5-49c7-9a46-fd0ccba53895-ca-trust-extracted\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.280940 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.280915 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2f036a5-fee5-49c7-9a46-fd0ccba53895-registry-certificates\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.281243 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.281217 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2f036a5-fee5-49c7-9a46-fd0ccba53895-trusted-ca\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.282632 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.282586 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2f036a5-fee5-49c7-9a46-fd0ccba53895-image-registry-private-configuration\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.282729 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.282671 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2f036a5-fee5-49c7-9a46-fd0ccba53895-registry-tls\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.282729 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.282700 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2f036a5-fee5-49c7-9a46-fd0ccba53895-installation-pull-secrets\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.291986 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.291962 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2f036a5-fee5-49c7-9a46-fd0ccba53895-bound-sa-token\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.292258 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.292241 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl29j\" (UniqueName: \"kubernetes.io/projected/e2f036a5-fee5-49c7-9a46-fd0ccba53895-kube-api-access-fl29j\") pod \"image-registry-6994467cc7-df69f\" (UID: \"e2f036a5-fee5-49c7-9a46-fd0ccba53895\") " pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.363109 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.363025 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dn8ld" Apr 16 18:19:12.451681 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.451659 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:12.615470 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.615449 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dn8ld"] Apr 16 18:19:12.619334 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:19:12.619309 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b335442_4810_4f3b_a541_31865a746c8b.slice/crio-37b4bf66688c584a8a486f6cb3273e3a712a1977e6f8b4bb749120fc21aadee5 WatchSource:0}: Error finding container 37b4bf66688c584a8a486f6cb3273e3a712a1977e6f8b4bb749120fc21aadee5: Status 404 returned error can't find the container with id 37b4bf66688c584a8a486f6cb3273e3a712a1977e6f8b4bb749120fc21aadee5 Apr 16 18:19:12.629777 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:12.629707 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6994467cc7-df69f"] Apr 16 18:19:12.637499 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:19:12.637251 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2f036a5_fee5_49c7_9a46_fd0ccba53895.slice/crio-8bbe7f9d77a43135d3df331b56ece0596daa06e20a4aa4490604dff7504b9859 WatchSource:0}: Error finding container 8bbe7f9d77a43135d3df331b56ece0596daa06e20a4aa4490604dff7504b9859: Status 404 returned error can't find the container with id 8bbe7f9d77a43135d3df331b56ece0596daa06e20a4aa4490604dff7504b9859 Apr 16 18:19:13.148665 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.148622 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6994467cc7-df69f" event={"ID":"e2f036a5-fee5-49c7-9a46-fd0ccba53895","Type":"ContainerStarted","Data":"454b47351316240f6f400518c23aecdbd492549d0a8efbd9c00c9076b3383cda"} Apr 16 18:19:13.149118 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.148672 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6994467cc7-df69f" event={"ID":"e2f036a5-fee5-49c7-9a46-fd0ccba53895","Type":"ContainerStarted","Data":"8bbe7f9d77a43135d3df331b56ece0596daa06e20a4aa4490604dff7504b9859"} Apr 16 18:19:13.149118 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.148748 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:13.150072 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.150046 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-646jh" event={"ID":"cea240e4-3f15-45d5-a754-105ae5e43a47","Type":"ContainerStarted","Data":"40320827cb3b444c9fbca47c12bd6b24e1cf84164a026158aad60ba61ed26484"} Apr 16 18:19:13.151788 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.151762 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bz65j" event={"ID":"e33c04f3-e414-4174-8047-0f84ece6cd5d","Type":"ContainerStarted","Data":"7d4c60f9dcc1f0a0a1db4ddaed1c359d8e4d3c92a7a7315d856c08a4bcd79543"} Apr 16 18:19:13.151910 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.151795 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bz65j" event={"ID":"e33c04f3-e414-4174-8047-0f84ece6cd5d","Type":"ContainerStarted","Data":"1ca0dadb851fd07f1f77437ddb14d07c56f40b69c8345ad7219239da477921a5"} Apr 16 18:19:13.151973 ip-10-0-128-68 
kubenswrapper[2562]: I0416 18:19:13.151919 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bz65j" Apr 16 18:19:13.153089 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.153071 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dn8ld" event={"ID":"7b335442-4810-4f3b-a541-31865a746c8b","Type":"ContainerStarted","Data":"c0f912a76813d8996f620b6509ea82fb27f3c9881c2c552c6fecdd39fb0805b5"} Apr 16 18:19:13.153169 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.153093 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dn8ld" event={"ID":"7b335442-4810-4f3b-a541-31865a746c8b","Type":"ContainerStarted","Data":"37b4bf66688c584a8a486f6cb3273e3a712a1977e6f8b4bb749120fc21aadee5"} Apr 16 18:19:13.167438 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.167369 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6994467cc7-df69f" podStartSLOduration=1.167353369 podStartE2EDuration="1.167353369s" podCreationTimestamp="2026-04-16 18:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:13.166889265 +0000 UTC m=+164.252338657" watchObservedRunningTime="2026-04-16 18:19:13.167353369 +0000 UTC m=+164.252802757" Apr 16 18:19:13.182366 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.182317 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bz65j" podStartSLOduration=129.214547528 podStartE2EDuration="2m11.182300334s" podCreationTimestamp="2026-04-16 18:17:02 +0000 UTC" firstStartedPulling="2026-04-16 18:19:10.47069493 +0000 UTC m=+161.556144300" lastFinishedPulling="2026-04-16 18:19:12.438447723 +0000 UTC m=+163.523897106" observedRunningTime="2026-04-16 18:19:13.181483514 +0000 UTC m=+164.266932907" watchObservedRunningTime="2026-04-16 18:19:13.182300334 +0000 UTC m=+164.267749726" Apr 16 18:19:13.197004 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:13.196959 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-646jh" podStartSLOduration=129.235433443 podStartE2EDuration="2m11.19694686s" podCreationTimestamp="2026-04-16 18:17:02 +0000 UTC" firstStartedPulling="2026-04-16 18:19:10.482928967 +0000 UTC m=+161.568378336" lastFinishedPulling="2026-04-16 18:19:12.444442384 +0000 UTC m=+163.529891753" observedRunningTime="2026-04-16 18:19:13.195671215 +0000 UTC m=+164.281120639" watchObservedRunningTime="2026-04-16 18:19:13.19694686 +0000 UTC m=+164.282396312" Apr 16 18:19:14.158781 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:14.158742 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dn8ld" event={"ID":"7b335442-4810-4f3b-a541-31865a746c8b","Type":"ContainerStarted","Data":"0bec106d9120a5a2de3582227e5cab5005d2f1ada8797893a447b4ec81aea3b3"} Apr 16 18:19:15.163363 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:15.163330 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dn8ld" event={"ID":"7b335442-4810-4f3b-a541-31865a746c8b","Type":"ContainerStarted","Data":"64cbfee11e6ef400a614228fae1e046956fe16b25f21af10e051cf71470d2328"} Apr 16 18:19:15.181766 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:15.181722 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dn8ld" podStartSLOduration=1.2480481700000001 podStartE2EDuration="3.181705414s" podCreationTimestamp="2026-04-16 18:19:12 +0000 UTC" firstStartedPulling="2026-04-16 18:19:12.705097688 +0000 UTC m=+163.790547072" lastFinishedPulling="2026-04-16 18:19:14.638754946 +0000 UTC m=+165.724204316" observedRunningTime="2026-04-16 18:19:15.18135618 +0000 UTC m=+166.266805571" watchObservedRunningTime="2026-04-16 18:19:15.181705414 +0000 UTC m=+166.267154805" Apr 16 18:19:18.519345 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:18.519256 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf" Apr 16 18:19:22.109961 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:22.109925 2562 patch_prober.go:28] interesting pod/image-registry-75585f59db-85qwd container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:19:22.110371 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:22.109989 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-75585f59db-85qwd" podUID="55883b38-898b-4426-9e22-f96a487c90c6" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:19:23.160962 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:23.160935 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bz65j" Apr 16 18:19:25.411063 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.411025 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-b5w4l"] Apr 16 18:19:25.415794 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.415766 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.421822 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.421796 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:19:25.423263 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.423236 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-5h6ps"] Apr 16 18:19:25.426394 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.426379 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.432970 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.432915 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:19:25.433301 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.433281 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:19:25.433488 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.433285 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:19:25.433488 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.433319 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:19:25.433654 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.433353 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:19:25.434722 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.434702 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:19:25.434834 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.434736 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:19:25.434834 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.434751 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4sncj\"" Apr 16 18:19:25.434834 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.434813 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:19:25.434834 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.434756 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gmsbp\"" Apr 16 18:19:25.455906 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.455872 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-5h6ps"] Apr 16 18:19:25.475816 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.475782 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.475816 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.475819 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.476088 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.475841 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55debf3d-042d-4ec7-811c-39c4ba0d540d-metrics-client-ca\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.476088 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.475857 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55debf3d-042d-4ec7-811c-39c4ba0d540d-sys\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.476088 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.475909 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-wtmp\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.476088 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.475977 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc6lc\" (UniqueName: \"kubernetes.io/projected/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-api-access-dc6lc\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.476088 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.476006 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.476088 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.476053 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-tls\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.476088 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.476074 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-textfile\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.476460 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.476119 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.476460 ip-10-0-128-68 kubenswrapper[2562]: I0416 
18:19:25.476170 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/55debf3d-042d-4ec7-811c-39c4ba0d540d-root\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.476460 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.476239 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6khvc\" (UniqueName: \"kubernetes.io/projected/55debf3d-042d-4ec7-811c-39c4ba0d540d-kube-api-access-6khvc\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.476460 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.476304 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76362cf3-2b44-437b-bd99-b7048c4e3aa6-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.476460 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.476345 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76362cf3-2b44-437b-bd99-b7048c4e3aa6-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.476460 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.476375 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-accelerators-collector-config\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.577157 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577120 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.577157 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577160 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.577429 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577210 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55debf3d-042d-4ec7-811c-39c4ba0d540d-metrics-client-ca\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.577429 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:19:25.577297 2562 
secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 18:19:25.577429 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577327 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55debf3d-042d-4ec7-811c-39c4ba0d540d-sys\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.577429 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:19:25.577365 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-tls podName:76362cf3-2b44-437b-bd99-b7048c4e3aa6 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:26.077349953 +0000 UTC m=+177.162799326 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-5h6ps" (UID: "76362cf3-2b44-437b-bd99-b7048c4e3aa6") : secret "kube-state-metrics-tls" not found Apr 16 18:19:25.577429 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577383 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55debf3d-042d-4ec7-811c-39c4ba0d540d-sys\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.577429 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577384 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-wtmp\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.577722 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577440 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dc6lc\" (UniqueName: \"kubernetes.io/projected/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-api-access-dc6lc\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.577722 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577471 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.577722 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577489 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-tls\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.577722 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577498 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-wtmp\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.577722 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577507 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-textfile\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.577722 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577709 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.578019 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577753 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/55debf3d-042d-4ec7-811c-39c4ba0d540d-root\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.578019 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577793 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-textfile\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.578019 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577797 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6khvc\" (UniqueName: \"kubernetes.io/projected/55debf3d-042d-4ec7-811c-39c4ba0d540d-kube-api-access-6khvc\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.578019 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577837 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/55debf3d-042d-4ec7-811c-39c4ba0d540d-root\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.578019 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577861 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76362cf3-2b44-437b-bd99-b7048c4e3aa6-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.578019 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577879 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55debf3d-042d-4ec7-811c-39c4ba0d540d-metrics-client-ca\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.578019 
ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577911 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76362cf3-2b44-437b-bd99-b7048c4e3aa6-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.578019 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.577940 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-accelerators-collector-config\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.578455 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.578431 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.578513 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.578465 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76362cf3-2b44-437b-bd99-b7048c4e3aa6-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.578513 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.578498 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-accelerators-collector-config\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.578820 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.578798 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76362cf3-2b44-437b-bd99-b7048c4e3aa6-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.579792 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.579758 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.579874 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.579835 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/55debf3d-042d-4ec7-811c-39c4ba0d540d-node-exporter-tls\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.579976 
ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.579959 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.585798 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.585777 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc6lc\" (UniqueName: \"kubernetes.io/projected/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-api-access-dc6lc\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:25.586357 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.586333 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6khvc\" (UniqueName: \"kubernetes.io/projected/55debf3d-042d-4ec7-811c-39c4ba0d540d-kube-api-access-6khvc\") pod \"node-exporter-b5w4l\" (UID: \"55debf3d-042d-4ec7-811c-39c4ba0d540d\") " pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.725554 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:25.725526 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-b5w4l" Apr 16 18:19:25.733875 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:19:25.733846 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55debf3d_042d_4ec7_811c_39c4ba0d540d.slice/crio-93143b093dce3fd725184c76db9240394380d1cf4f1f0312154c03f555f315c6 WatchSource:0}: Error finding container 93143b093dce3fd725184c76db9240394380d1cf4f1f0312154c03f555f315c6: Status 404 returned error can't find the container with id 93143b093dce3fd725184c76db9240394380d1cf4f1f0312154c03f555f315c6 Apr 16 18:19:26.081960 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.081863 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:26.084150 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.084124 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76362cf3-2b44-437b-bd99-b7048c4e3aa6-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-5h6ps\" (UID: \"76362cf3-2b44-437b-bd99-b7048c4e3aa6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:26.192753 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.192713 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b5w4l" event={"ID":"55debf3d-042d-4ec7-811c-39c4ba0d540d","Type":"ContainerStarted","Data":"93143b093dce3fd725184c76db9240394380d1cf4f1f0312154c03f555f315c6"} Apr 16 18:19:26.335400 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.335320 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" Apr 16 18:19:26.512485 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.512457 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:19:26.517108 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.517086 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.519692 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.519670 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:19:26.520002 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.519983 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:19:26.520370 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.520205 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:19:26.520370 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.520237 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:19:26.520370 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.520242 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:19:26.520370 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.520275 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:19:26.520370 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.520257 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-m9dzg\"" Apr 16 18:19:26.520833 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.520739 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:19:26.520833 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.520794 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:19:26.520833 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.520799 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:19:26.531243 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.531166 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:19:26.570435 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.570410 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-5h6ps"] Apr 16 18:19:26.573606 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:19:26.573582 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76362cf3_2b44_437b_bd99_b7048c4e3aa6.slice/crio-544fd77f91ee1e741ab3d96b16585f610586d6d4b0662f311c0bd4197a0a1158 WatchSource:0}: Error finding container 544fd77f91ee1e741ab3d96b16585f610586d6d4b0662f311c0bd4197a0a1158: Status 404 returned error can't find the container with id 
544fd77f91ee1e741ab3d96b16585f610586d6d4b0662f311c0bd4197a0a1158 Apr 16 18:19:26.586519 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586452 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv9k4\" (UniqueName: \"kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-kube-api-access-bv9k4\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.586519 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586496 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.586767 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586527 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-volume\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.586767 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586600 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-out\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.586767 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586643 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.586767 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586675 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.586767 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586704 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.587026 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586792 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.587026 ip-10-0-128-68 kubenswrapper[2562]: 
I0416 18:19:26.586826 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.587026 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586856 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-web-config\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.587026 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586893 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.587026 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586921 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.587026 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.586967 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688277 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688244 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688454 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688294 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688454 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688322 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688454 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688351 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688454 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688378 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688454 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688403 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-web-config\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688698 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688594 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688698 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688637 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688698 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688686 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688852 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688723 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bv9k4\" (UniqueName: \"kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-kube-api-access-bv9k4\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688852 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688754 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688852 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688784 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.688852 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688839 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.689056 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.688875 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-out\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.691588 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.691559 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.691720 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.691606 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.691848 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.691824 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-web-config\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.692064 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.691969 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.696260 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.692586 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.696260 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.692861 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.696260 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.693038 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.696260 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.693246 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-volume\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.696260 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.694455 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-out\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.696260 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.695135 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.696260 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.695598 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.699256 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.699232 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv9k4\" (UniqueName: \"kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-kube-api-access-bv9k4\") pod \"alertmanager-main-0\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.850714 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.850635 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:19:26.979436 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:26.979401 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:19:26.982782 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:19:26.982748 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1854add1_b31b_41e4_b956_a96afbbfcf9f.slice/crio-22549cf2bb9c33bdc92367c0b90881cac2e47285a7787e2baa0d711086e1bb05 WatchSource:0}: Error finding container 22549cf2bb9c33bdc92367c0b90881cac2e47285a7787e2baa0d711086e1bb05: Status 404 returned error can't find the container with id 22549cf2bb9c33bdc92367c0b90881cac2e47285a7787e2baa0d711086e1bb05 Apr 16 18:19:27.197046 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:27.197009 2562 generic.go:358] "Generic (PLEG): container finished" podID="55debf3d-042d-4ec7-811c-39c4ba0d540d" containerID="3168605e4a1288253c6dbec0461643b8949f2cfdaa294c67fb6a822012f49750" exitCode=0 Apr 16 18:19:27.197276 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:27.197101 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b5w4l" event={"ID":"55debf3d-042d-4ec7-811c-39c4ba0d540d","Type":"ContainerDied","Data":"3168605e4a1288253c6dbec0461643b8949f2cfdaa294c67fb6a822012f49750"} Apr 16 18:19:27.198415 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:27.198388 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerStarted","Data":"22549cf2bb9c33bdc92367c0b90881cac2e47285a7787e2baa0d711086e1bb05"} Apr 16 18:19:27.199519 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:27.199496 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" event={"ID":"76362cf3-2b44-437b-bd99-b7048c4e3aa6","Type":"ContainerStarted","Data":"544fd77f91ee1e741ab3d96b16585f610586d6d4b0662f311c0bd4197a0a1158"} Apr 16 18:19:28.205533 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:28.205499 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" event={"ID":"76362cf3-2b44-437b-bd99-b7048c4e3aa6","Type":"ContainerStarted","Data":"42fe09b7f1b559f0b958ce0679c815c316517d0ea864e454180955c38f89c926"} Apr 16 18:19:28.205989 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:28.205541 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" event={"ID":"76362cf3-2b44-437b-bd99-b7048c4e3aa6","Type":"ContainerStarted","Data":"c1202b164a35c319ea352dc8b1fe92fea44ef459d166b4245f885e630f50f64b"} Apr 16 18:19:28.205989 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:28.205561 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" event={"ID":"76362cf3-2b44-437b-bd99-b7048c4e3aa6","Type":"ContainerStarted","Data":"ffbb9c6f93203b9b041192368c5734f42f04f410c3fde98051f936758cebd0fb"} Apr 16 18:19:28.210395 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:28.208367 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b5w4l" event={"ID":"55debf3d-042d-4ec7-811c-39c4ba0d540d","Type":"ContainerStarted","Data":"4d469a96ccb132eeb476277f2eecb557aaa7261d7125e0deca45d0235c687344"} Apr 16 18:19:28.210395 ip-10-0-128-68 kubenswrapper[2562]: 
I0416 18:19:28.208404 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b5w4l" event={"ID":"55debf3d-042d-4ec7-811c-39c4ba0d540d","Type":"ContainerStarted","Data":"e3139d748917c108ee69a208e9d5c2404058eaf2f379fb1c89aac933e9f819c1"} Apr 16 18:19:28.226708 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:28.226648 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-5h6ps" podStartSLOduration=2.064348146 podStartE2EDuration="3.226627685s" podCreationTimestamp="2026-04-16 18:19:25 +0000 UTC" firstStartedPulling="2026-04-16 18:19:26.575447529 +0000 UTC m=+177.660896899" lastFinishedPulling="2026-04-16 18:19:27.73772707 +0000 UTC m=+178.823176438" observedRunningTime="2026-04-16 18:19:28.224984987 +0000 UTC m=+179.310434379" watchObservedRunningTime="2026-04-16 18:19:28.226627685 +0000 UTC m=+179.312077077" Apr 16 18:19:28.248673 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:28.248624 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-b5w4l" podStartSLOduration=2.504635304 podStartE2EDuration="3.248608972s" podCreationTimestamp="2026-04-16 18:19:25 +0000 UTC" firstStartedPulling="2026-04-16 18:19:25.738106785 +0000 UTC m=+176.823556154" lastFinishedPulling="2026-04-16 18:19:26.482080433 +0000 UTC m=+177.567529822" observedRunningTime="2026-04-16 18:19:28.248069724 +0000 UTC m=+179.333519128" watchObservedRunningTime="2026-04-16 18:19:28.248608972 +0000 UTC m=+179.334058362" Apr 16 18:19:29.212453 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.212417 2562 generic.go:358] "Generic (PLEG): container finished" podID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerID="10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d" exitCode=0 Apr 16 18:19:29.212851 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.212497 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerDied","Data":"10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d"} Apr 16 18:19:29.942973 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.942930 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-55c9f64cb6-5j7rb"] Apr 16 18:19:29.945983 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.945958 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:29.950572 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.950542 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 18:19:29.950751 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.950725 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-lqlkg\"" Apr 16 18:19:29.950855 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.950800 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 18:19:29.951025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.950991 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 18:19:29.951118 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.951017 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9ensm3a9updgc\"" Apr 16 18:19:29.951542 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.951515 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:19:29.968071 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:29.968041 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55c9f64cb6-5j7rb"] Apr 16 18:19:30.016942 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.016906 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d04312b8-1687-41a0-a148-9677c142ead3-metrics-server-audit-profiles\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.016942 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.016944 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tx62\" (UniqueName: \"kubernetes.io/projected/d04312b8-1687-41a0-a148-9677c142ead3-kube-api-access-6tx62\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.017166 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.016973 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d04312b8-1687-41a0-a148-9677c142ead3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.017166 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.017099 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d04312b8-1687-41a0-a148-9677c142ead3-audit-log\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.017166 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.017133 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04312b8-1687-41a0-a148-9677c142ead3-client-ca-bundle\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.017166 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.017162 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d04312b8-1687-41a0-a148-9677c142ead3-secret-metrics-server-tls\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.017422 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.017225 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d04312b8-1687-41a0-a148-9677c142ead3-secret-metrics-server-client-certs\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.118243 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.118180 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d04312b8-1687-41a0-a148-9677c142ead3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.118393 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.118302 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d04312b8-1687-41a0-a148-9677c142ead3-audit-log\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.118393 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.118330 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04312b8-1687-41a0-a148-9677c142ead3-client-ca-bundle\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.118393 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.118348 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d04312b8-1687-41a0-a148-9677c142ead3-secret-metrics-server-tls\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.118393 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.118380 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d04312b8-1687-41a0-a148-9677c142ead3-secret-metrics-server-client-certs\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.118556 ip-10-0-128-68 kubenswrapper[2562]: I0416 
18:19:30.118449 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d04312b8-1687-41a0-a148-9677c142ead3-metrics-server-audit-profiles\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.118556 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.118477 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tx62\" (UniqueName: \"kubernetes.io/projected/d04312b8-1687-41a0-a148-9677c142ead3-kube-api-access-6tx62\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.118721 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.118699 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d04312b8-1687-41a0-a148-9677c142ead3-audit-log\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.119062 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.119017 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d04312b8-1687-41a0-a148-9677c142ead3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.119367 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.119345 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d04312b8-1687-41a0-a148-9677c142ead3-metrics-server-audit-profiles\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.120852 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.120830 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d04312b8-1687-41a0-a148-9677c142ead3-secret-metrics-server-client-certs\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.120984 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.120922 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04312b8-1687-41a0-a148-9677c142ead3-client-ca-bundle\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.120984 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.120937 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d04312b8-1687-41a0-a148-9677c142ead3-secret-metrics-server-tls\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.131775 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.131750 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tx62\" (UniqueName: \"kubernetes.io/projected/d04312b8-1687-41a0-a148-9677c142ead3-kube-api-access-6tx62\") pod \"metrics-server-55c9f64cb6-5j7rb\" (UID: \"d04312b8-1687-41a0-a148-9677c142ead3\") " pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.177167 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.177131 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7"] Apr 16 18:19:30.181449 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.181433 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" Apr 16 18:19:30.183882 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.183859 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 18:19:30.183882 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.183874 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-qgkjx\"" Apr 16 18:19:30.188123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.188100 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7"] Apr 16 18:19:30.219100 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.219021 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8c889714-3a36-4fdd-b6ba-51298672e02f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-wqvh7\" (UID: \"8c889714-3a36-4fdd-b6ba-51298672e02f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" Apr 16 18:19:30.257148 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.257113 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" Apr 16 18:19:30.320774 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.320172 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8c889714-3a36-4fdd-b6ba-51298672e02f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-wqvh7\" (UID: \"8c889714-3a36-4fdd-b6ba-51298672e02f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" Apr 16 18:19:30.320774 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:19:30.320389 2562 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 18:19:30.320774 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:19:30.320461 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c889714-3a36-4fdd-b6ba-51298672e02f-monitoring-plugin-cert podName:8c889714-3a36-4fdd-b6ba-51298672e02f nodeName:}" failed. No retries permitted until 2026-04-16 18:19:30.820440077 +0000 UTC m=+181.905889467 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/8c889714-3a36-4fdd-b6ba-51298672e02f-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-wqvh7" (UID: "8c889714-3a36-4fdd-b6ba-51298672e02f") : secret "monitoring-plugin-cert" not found Apr 16 18:19:30.380659 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.380625 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55c9f64cb6-5j7rb"] Apr 16 18:19:30.383470 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:19:30.383450 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd04312b8_1687_41a0_a148_9677c142ead3.slice/crio-9535f08ed803f5ea0a5d0ea0fdef3ace4e2318f86025c1fa253e34fa6f50b63c WatchSource:0}: Error finding container 9535f08ed803f5ea0a5d0ea0fdef3ace4e2318f86025c1fa253e34fa6f50b63c: Status 404 returned error can't find the container with id 9535f08ed803f5ea0a5d0ea0fdef3ace4e2318f86025c1fa253e34fa6f50b63c Apr 16 18:19:30.825608 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.825572 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8c889714-3a36-4fdd-b6ba-51298672e02f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-wqvh7\" (UID: \"8c889714-3a36-4fdd-b6ba-51298672e02f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" Apr 16 18:19:30.828609 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:30.828578 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8c889714-3a36-4fdd-b6ba-51298672e02f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-wqvh7\" (UID: \"8c889714-3a36-4fdd-b6ba-51298672e02f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" Apr 16 18:19:31.090941 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.090849 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" Apr 16 18:19:31.219949 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.219903 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" event={"ID":"d04312b8-1687-41a0-a148-9677c142ead3","Type":"ContainerStarted","Data":"9535f08ed803f5ea0a5d0ea0fdef3ace4e2318f86025c1fa253e34fa6f50b63c"} Apr 16 18:19:31.618461 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.618427 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:19:31.622537 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.622508 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.625123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.625101 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:19:31.625640 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.625621 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-r6rnm\"" Apr 16 18:19:31.625981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.625957 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:19:31.627562 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.627540 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:19:31.627735 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.627713 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:19:31.627850 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.627793 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:19:31.628406 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.628386 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:19:31.628559 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.628538 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:19:31.628684 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.628570 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:19:31.628781 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.628758 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:19:31.629073 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.629055 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bu6cc6r7m8ird\"" Apr 16 18:19:31.629989 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.629957 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:19:31.630722 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.630703 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:19:31.633159 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.633141 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:19:31.644031 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.644002 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:19:31.734058 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734016 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734265 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734068 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734265 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734106 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734265 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734129 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734265 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734154 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8gcr\" (UniqueName: \"kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-kube-api-access-h8gcr\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734265 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734250 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-web-config\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734494 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734309 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734494 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734341 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734494 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734376 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734494 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734415 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734494 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734436 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734494 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734482 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config-out\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734735 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734541 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734735 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734566 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734735 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734584 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734735 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734652 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734735 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734679 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.734735 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.734717 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.835485 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835448 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config-out\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.835647 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835497 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.835647 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835524 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.835647 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835550 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.835647 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835604 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.835647 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835633 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835655 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835683 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835720 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835748 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835771 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835803 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8gcr\" (UniqueName: \"kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-kube-api-access-h8gcr\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835832 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-web-config\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835865 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835904 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.835933 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 
18:19:31.835979 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.836003 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836613 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.836543 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.836874 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.836767 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.837331 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.837305 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.838880 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.838658 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.838991 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.838931 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-web-config\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.838991 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.838941 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.839256 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.839224 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.839789 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.839744 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.839877 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.839826 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.839966 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.839878 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.840076 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.840054 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config-out\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.840559 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.840534 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.841075 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.841053 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.841896 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.841877 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.842036 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.842014 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.842353 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.842322 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.842426 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.842359 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.844439 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.844419 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8gcr\" (UniqueName: \"kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-kube-api-access-h8gcr\") pod \"prometheus-k8s-0\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.937524 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.935372 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:19:31.986906 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:31.986877 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7"] Apr 16 18:19:31.990994 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:19:31.990964 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c889714_3a36_4fdd_b6ba_51298672e02f.slice/crio-c87b295318840abab0717fd10fbabe3dde9b90212fa29f2e58f22b6e832a7569 WatchSource:0}: Error finding container c87b295318840abab0717fd10fbabe3dde9b90212fa29f2e58f22b6e832a7569: Status 404 returned error can't find the container with id c87b295318840abab0717fd10fbabe3dde9b90212fa29f2e58f22b6e832a7569 Apr 16 18:19:32.087340 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.087279 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:19:32.090359 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:19:32.090326 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1783d49b_8e3f_4ed3_be87_7b8a3c1d4141.slice/crio-95410369aff54908cc69e48ce39bc77b0fac8c2f93280ac1787cc479d6b223e4 WatchSource:0}: Error finding container 95410369aff54908cc69e48ce39bc77b0fac8c2f93280ac1787cc479d6b223e4: Status 404 returned error can't find the container with id 95410369aff54908cc69e48ce39bc77b0fac8c2f93280ac1787cc479d6b223e4 Apr 16 18:19:32.110659 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.110633 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-75585f59db-85qwd" Apr 16 18:19:32.229365 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.229328 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" event={"ID":"d04312b8-1687-41a0-a148-9677c142ead3","Type":"ContainerStarted","Data":"d938309b3469872f94bef7a4dd0335f283869a2e7959d86d66213ddbfdb9d211"} Apr 16 18:19:32.230573 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.230548 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" 
event={"ID":"8c889714-3a36-4fdd-b6ba-51298672e02f","Type":"ContainerStarted","Data":"c87b295318840abab0717fd10fbabe3dde9b90212fa29f2e58f22b6e832a7569"} Apr 16 18:19:32.231932 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.231911 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerStarted","Data":"dda1dea3c25e5dea0f0c61b7647b0cca4aa2d5ab33e477a1ea7d68704b83bcf3"} Apr 16 18:19:32.232008 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.231939 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerStarted","Data":"95410369aff54908cc69e48ce39bc77b0fac8c2f93280ac1787cc479d6b223e4"} Apr 16 18:19:32.234699 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.234673 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerStarted","Data":"b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b"} Apr 16 18:19:32.234699 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.234700 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerStarted","Data":"1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd"} Apr 16 18:19:32.234815 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.234713 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerStarted","Data":"eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad"} Apr 16 18:19:32.234815 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.234726 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerStarted","Data":"beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729"} Apr 16 18:19:32.247233 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:32.247114 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb" podStartSLOduration=1.776258393 podStartE2EDuration="3.247096107s" podCreationTimestamp="2026-04-16 18:19:29 +0000 UTC" firstStartedPulling="2026-04-16 18:19:30.385426716 +0000 UTC m=+181.470876086" lastFinishedPulling="2026-04-16 18:19:31.856264431 +0000 UTC m=+182.941713800" observedRunningTime="2026-04-16 18:19:32.24571554 +0000 UTC m=+183.331164932" watchObservedRunningTime="2026-04-16 18:19:32.247096107 +0000 UTC m=+183.332545499" Apr 16 18:19:33.242854 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:33.242770 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerStarted","Data":"2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949"} Apr 16 18:19:33.242854 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:33.242816 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerStarted","Data":"e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0"} Apr 16 18:19:33.244321 ip-10-0-128-68 kubenswrapper[2562]: I0416 
18:19:33.244296 2562 generic.go:358] "Generic (PLEG): container finished" podID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerID="dda1dea3c25e5dea0f0c61b7647b0cca4aa2d5ab33e477a1ea7d68704b83bcf3" exitCode=0 Apr 16 18:19:33.244429 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:33.244373 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerDied","Data":"dda1dea3c25e5dea0f0c61b7647b0cca4aa2d5ab33e477a1ea7d68704b83bcf3"} Apr 16 18:19:33.271623 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:33.271402 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.315204761 podStartE2EDuration="7.271354701s" podCreationTimestamp="2026-04-16 18:19:26 +0000 UTC" firstStartedPulling="2026-04-16 18:19:26.984581421 +0000 UTC m=+178.070030791" lastFinishedPulling="2026-04-16 18:19:32.940731357 +0000 UTC m=+184.026180731" observedRunningTime="2026-04-16 18:19:33.270020419 +0000 UTC m=+184.355469811" watchObservedRunningTime="2026-04-16 18:19:33.271354701 +0000 UTC m=+184.356804114" Apr 16 18:19:34.162741 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:34.162713 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6994467cc7-df69f" Apr 16 18:19:34.249820 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:34.249778 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" event={"ID":"8c889714-3a36-4fdd-b6ba-51298672e02f","Type":"ContainerStarted","Data":"9b25ebf200ec1786eaee737e9387f32f5e97c22a19a85f9114cdfb00a0e003d2"} Apr 16 18:19:34.250341 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:34.250305 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" Apr 16 18:19:34.254878 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:34.254850 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" Apr 16 18:19:34.270824 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:34.270769 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wqvh7" podStartSLOduration=2.649795266 podStartE2EDuration="4.270750755s" podCreationTimestamp="2026-04-16 18:19:30 +0000 UTC" firstStartedPulling="2026-04-16 18:19:31.992940499 +0000 UTC m=+183.078389873" lastFinishedPulling="2026-04-16 18:19:33.613895982 +0000 UTC m=+184.699345362" observedRunningTime="2026-04-16 18:19:34.269234877 +0000 UTC m=+185.354684262" watchObservedRunningTime="2026-04-16 18:19:34.270750755 +0000 UTC m=+185.356200151" Apr 16 18:19:36.259055 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:36.259015 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerStarted","Data":"ca5b44c0f3a692d12358376d9420816cd24ebb761ebf63eb8757752f009449a1"} Apr 16 18:19:36.259055 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:36.259060 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerStarted","Data":"7da9a1db543a25c41e5915e29345de8cc01dc8df0b4ac77d5c95d4b901cfeb3d"} Apr 16 18:19:37.124806 ip-10-0-128-68 
kubenswrapper[2562]: I0416 18:19:37.124763 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-75585f59db-85qwd" podUID="55883b38-898b-4426-9e22-f96a487c90c6" containerName="registry" containerID="cri-o://66eb5f33f14ef8938585f052f792a2d9d1e3f816246da7c34d126ad45f15d431" gracePeriod=30
Apr 16 18:19:37.263736 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.263702 2562 generic.go:358] "Generic (PLEG): container finished" podID="55883b38-898b-4426-9e22-f96a487c90c6" containerID="66eb5f33f14ef8938585f052f792a2d9d1e3f816246da7c34d126ad45f15d431" exitCode=0
Apr 16 18:19:37.264106 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.263753 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75585f59db-85qwd" event={"ID":"55883b38-898b-4426-9e22-f96a487c90c6","Type":"ContainerDied","Data":"66eb5f33f14ef8938585f052f792a2d9d1e3f816246da7c34d126ad45f15d431"}
Apr 16 18:19:37.844589 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.844316 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-75585f59db-85qwd"
Apr 16 18:19:37.896487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.896429 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-trusted-ca\") pod \"55883b38-898b-4426-9e22-f96a487c90c6\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") "
Apr 16 18:19:37.896487 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.896466 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls\") pod \"55883b38-898b-4426-9e22-f96a487c90c6\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") "
Apr 16 18:19:37.896603 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.896496 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-registry-certificates\") pod \"55883b38-898b-4426-9e22-f96a487c90c6\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") "
Apr 16 18:19:37.896603 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.896522 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-bound-sa-token\") pod \"55883b38-898b-4426-9e22-f96a487c90c6\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") "
Apr 16 18:19:37.896603 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.896586 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-image-registry-private-configuration\") pod \"55883b38-898b-4426-9e22-f96a487c90c6\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") "
Apr 16 18:19:37.896734 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.896628 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkdfw\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-kube-api-access-dkdfw\") pod \"55883b38-898b-4426-9e22-f96a487c90c6\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") "
Apr 16 18:19:37.896734 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.896656 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-installation-pull-secrets\") pod \"55883b38-898b-4426-9e22-f96a487c90c6\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") "
Apr 16 18:19:37.896734 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.896700 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55883b38-898b-4426-9e22-f96a487c90c6-ca-trust-extracted\") pod \"55883b38-898b-4426-9e22-f96a487c90c6\" (UID: \"55883b38-898b-4426-9e22-f96a487c90c6\") "
Apr 16 18:19:37.897178 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.896873 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "55883b38-898b-4426-9e22-f96a487c90c6" (UID: "55883b38-898b-4426-9e22-f96a487c90c6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:37.897178 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.897099 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "55883b38-898b-4426-9e22-f96a487c90c6" (UID: "55883b38-898b-4426-9e22-f96a487c90c6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:37.897445 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.897423 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-trusted-ca\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:19:37.897508 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.897453 2562 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55883b38-898b-4426-9e22-f96a487c90c6-registry-certificates\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:19:37.899983 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.899620 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "55883b38-898b-4426-9e22-f96a487c90c6" (UID: "55883b38-898b-4426-9e22-f96a487c90c6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:37.899983 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.899639 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "55883b38-898b-4426-9e22-f96a487c90c6" (UID: "55883b38-898b-4426-9e22-f96a487c90c6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:37.899983 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.899658 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "55883b38-898b-4426-9e22-f96a487c90c6" (UID: "55883b38-898b-4426-9e22-f96a487c90c6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:19:37.900216 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.899993 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "55883b38-898b-4426-9e22-f96a487c90c6" (UID: "55883b38-898b-4426-9e22-f96a487c90c6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:19:37.900216 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.900041 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-kube-api-access-dkdfw" (OuterVolumeSpecName: "kube-api-access-dkdfw") pod "55883b38-898b-4426-9e22-f96a487c90c6" (UID: "55883b38-898b-4426-9e22-f96a487c90c6"). InnerVolumeSpecName "kube-api-access-dkdfw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:37.908457 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.908425 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55883b38-898b-4426-9e22-f96a487c90c6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "55883b38-898b-4426-9e22-f96a487c90c6" (UID: "55883b38-898b-4426-9e22-f96a487c90c6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:19:37.998802 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.998769 2562 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-registry-tls\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:19:37.998802 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.998794 2562 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-bound-sa-token\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:19:37.998802 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.998805 2562 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-image-registry-private-configuration\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:19:37.998987 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.998816 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dkdfw\" (UniqueName: \"kubernetes.io/projected/55883b38-898b-4426-9e22-f96a487c90c6-kube-api-access-dkdfw\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:19:37.998987 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.998826 2562 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55883b38-898b-4426-9e22-f96a487c90c6-installation-pull-secrets\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:19:37.998987 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:37.998836 2562 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55883b38-898b-4426-9e22-f96a487c90c6-ca-trust-extracted\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:19:38.268298 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:38.268211 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-75585f59db-85qwd"
Apr 16 18:19:38.268714 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:38.268213 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75585f59db-85qwd" event={"ID":"55883b38-898b-4426-9e22-f96a487c90c6","Type":"ContainerDied","Data":"d8af525dddc2309cbe14f5d7787881d1db64ed3836b6d72423d1f991b771cfdc"}
Apr 16 18:19:38.268714 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:38.268342 2562 scope.go:117] "RemoveContainer" containerID="66eb5f33f14ef8938585f052f792a2d9d1e3f816246da7c34d126ad45f15d431"
Apr 16 18:19:38.271501 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:38.271460 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerStarted","Data":"38f89b64f9c34c84787732301b2dd5be226265bff4949e9b35f342ec1194557c"}
Apr 16 18:19:38.271501 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:38.271496 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerStarted","Data":"c5ba245a2df2f9026b9d96d9e12b64b0b6616ce7aa9959832f45fec0ccea7697"}
Apr 16 18:19:38.271651 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:38.271510 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerStarted","Data":"560859b45c85ee3369d06784219eb58e4766d8d7c55fc01b9fcca9a09c49a65a"}
Apr 16 18:19:38.271651 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:38.271523 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerStarted","Data":"70217f856ff192114119a2793d56b01b0efdbfe7e0df28fc87986cdb488dde88"}
Apr 16 18:19:38.304834 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:38.304787 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.797460399 podStartE2EDuration="7.304770692s" podCreationTimestamp="2026-04-16 18:19:31 +0000 UTC" firstStartedPulling="2026-04-16 18:19:33.24550178 +0000 UTC m=+184.330951149" lastFinishedPulling="2026-04-16 18:19:37.752812054 +0000 UTC m=+188.838261442" observedRunningTime="2026-04-16 18:19:38.297964469 +0000 UTC m=+189.383413860" watchObservedRunningTime="2026-04-16 18:19:38.304770692 +0000 UTC m=+189.390220083"
Apr 16 18:19:38.316318 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:38.316290 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-75585f59db-85qwd"]
Apr 16 18:19:38.319936 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:38.319909 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-75585f59db-85qwd"]
Apr 16 18:19:39.523033 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:39.522999 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55883b38-898b-4426-9e22-f96a487c90c6" path="/var/lib/kubelet/pods/55883b38-898b-4426-9e22-f96a487c90c6/volumes"
Apr 16 18:19:41.936325 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:41.936281 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:19:50.257415 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:50.257374 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb"
Apr 16 18:19:50.257805 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:19:50.257456 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb"
Apr 16 18:20:03.348024 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:03.347987 2562 generic.go:358] "Generic (PLEG): container finished" podID="ac44aa61-86ed-41f4-aab0-bbabab9224b1" containerID="34a875e4b3ca064323135ad75840d09046136ff75ba6132aa7ad8359c519a8a1" exitCode=0
Apr 16 18:20:03.348427 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:03.348058 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" event={"ID":"ac44aa61-86ed-41f4-aab0-bbabab9224b1","Type":"ContainerDied","Data":"34a875e4b3ca064323135ad75840d09046136ff75ba6132aa7ad8359c519a8a1"}
Apr 16 18:20:03.348473 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:03.348427 2562 scope.go:117] "RemoveContainer" containerID="34a875e4b3ca064323135ad75840d09046136ff75ba6132aa7ad8359c519a8a1"
Apr 16 18:20:04.351925 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:04.351887 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-dwl9r" event={"ID":"ac44aa61-86ed-41f4-aab0-bbabab9224b1","Type":"ContainerStarted","Data":"5913db191fc3bd733ef316219568f9f12f4a1c19c23b2ff5db166ca882a41432"}
Apr 16 18:20:10.263449 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:10.263417 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb"
Apr 16 18:20:10.267255 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:10.267229 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-55c9f64cb6-5j7rb"
Apr 16 18:20:17.398717 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:17.398684 2562 generic.go:358] "Generic (PLEG): container finished" podID="6f57c457-af5c-41b7-b405-ac38ae8bd95a" containerID="ff9ddf4ddbc7cb03154eca9d48b27cf6020b31b12ee438a1107dc8a48705c6c0" exitCode=0
Apr 16 18:20:17.399141 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:17.398735 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" event={"ID":"6f57c457-af5c-41b7-b405-ac38ae8bd95a","Type":"ContainerDied","Data":"ff9ddf4ddbc7cb03154eca9d48b27cf6020b31b12ee438a1107dc8a48705c6c0"}
Apr 16 18:20:17.399141 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:17.399026 2562 scope.go:117] "RemoveContainer" containerID="ff9ddf4ddbc7cb03154eca9d48b27cf6020b31b12ee438a1107dc8a48705c6c0"
Apr 16 18:20:18.402955 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:18.402923 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-72hl7" event={"ID":"6f57c457-af5c-41b7-b405-ac38ae8bd95a","Type":"ContainerStarted","Data":"7fc8040aaf44a37d0fe088ab43231deb3ac78595d962739ca27bf51cd12734a0"}
Apr 16 18:20:31.936835 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:31.936795 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:20:31.956539 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:31.956513 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:20:32.465289 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:32.465255 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:20:40.263092 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:40.263057 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:20:40.265316 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:40.265291 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7682219f-20c8-40ee-a84d-c68d79df1dd8-metrics-certs\") pod \"network-metrics-daemon-4vgjf\" (UID: \"7682219f-20c8-40ee-a84d-c68d79df1dd8\") " pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:20:40.423132 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:40.423105 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pwdr4\""
Apr 16 18:20:40.430351 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:40.430330 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vgjf"
Apr 16 18:20:40.554063 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:40.553861 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4vgjf"]
Apr 16 18:20:40.556107 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:20:40.556081 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7682219f_20c8_40ee_a84d_c68d79df1dd8.slice/crio-34447bbf68f1a532e32e162e8fcc0c9ca5a1475626aee88dab38ce247af75e47 WatchSource:0}: Error finding container 34447bbf68f1a532e32e162e8fcc0c9ca5a1475626aee88dab38ce247af75e47: Status 404 returned error can't find the container with id 34447bbf68f1a532e32e162e8fcc0c9ca5a1475626aee88dab38ce247af75e47
Apr 16 18:20:41.481067 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:41.481032 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vgjf" event={"ID":"7682219f-20c8-40ee-a84d-c68d79df1dd8","Type":"ContainerStarted","Data":"34447bbf68f1a532e32e162e8fcc0c9ca5a1475626aee88dab38ce247af75e47"}
Apr 16 18:20:42.485096 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:42.485059 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vgjf" event={"ID":"7682219f-20c8-40ee-a84d-c68d79df1dd8","Type":"ContainerStarted","Data":"c2b3df3b55fffd9a2fcc33453840fb9b82322366380ea07e3c8e03d86a84097e"}
Apr 16 18:20:42.485096 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:42.485101 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vgjf" event={"ID":"7682219f-20c8-40ee-a84d-c68d79df1dd8","Type":"ContainerStarted","Data":"a3c43af14d504778c95f5fb303d3e238751e60547fddf155b697d877b273693e"}
Apr 16 18:20:42.501261 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:42.501206 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4vgjf" podStartSLOduration=252.572007955 podStartE2EDuration="4m13.501174482s" podCreationTimestamp="2026-04-16 18:16:29 +0000 UTC" firstStartedPulling="2026-04-16 18:20:40.558176558 +0000 UTC m=+251.643625930" lastFinishedPulling="2026-04-16 18:20:41.487343079 +0000 UTC m=+252.572792457" observedRunningTime="2026-04-16 18:20:42.500881605 +0000 UTC m=+253.586331000" watchObservedRunningTime="2026-04-16 18:20:42.501174482 +0000 UTC m=+253.586623872"
Apr 16 18:20:45.721123 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:45.721083 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:20:45.722090 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:45.722055 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="alertmanager" containerID="cri-o://beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729" gracePeriod=120
Apr 16 18:20:45.722249 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:45.722109 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy-metric" containerID="cri-o://e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0" gracePeriod=120
Apr 16 18:20:45.722249 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:45.722136 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy-web" containerID="cri-o://1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd" gracePeriod=120
Apr 16 18:20:45.722373 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:45.722233 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="prom-label-proxy" containerID="cri-o://2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949" gracePeriod=120
Apr 16 18:20:45.722373 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:45.722268 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy" containerID="cri-o://b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b" gracePeriod=120
Apr 16 18:20:45.722470 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:45.722414 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="config-reloader" containerID="cri-o://eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad" gracePeriod=120
Apr 16 18:20:46.502070 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:46.502034 2562 generic.go:358] "Generic (PLEG): container finished" podID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerID="2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949" exitCode=0
Apr 16 18:20:46.502070 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:46.502061 2562 generic.go:358] "Generic (PLEG): container finished" podID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerID="b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b" exitCode=0
Apr 16 18:20:46.502070 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:46.502068 2562 generic.go:358] "Generic (PLEG): container finished" podID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerID="eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad" exitCode=0
Apr 16 18:20:46.502070 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:46.502075 2562 generic.go:358] "Generic (PLEG): container finished" podID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerID="beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729" exitCode=0
Apr 16 18:20:46.502379 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:46.502107 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerDied","Data":"2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949"}
Apr 16 18:20:46.502379 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:46.502145 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerDied","Data":"b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b"}
Apr 16 18:20:46.502379 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:46.502163 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerDied","Data":"eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad"}
Apr 16 18:20:46.502379 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:46.502179 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerDied","Data":"beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729"}
Apr 16 18:20:46.978235 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:46.978212 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:20:47.120012 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.119925 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-main-tls\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120012 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.119960 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120250 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120054 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-cluster-tls-config\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120250 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120091 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-trusted-ca-bundle\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120250 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120120 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-volume\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120250 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120151 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv9k4\" (UniqueName: \"kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-kube-api-access-bv9k4\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120250 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120176 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-web\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120250 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120241 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-out\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120570 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120269 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-main-db\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120570 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120353 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-tls-assets\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120570 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120404 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-web-config\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120570 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120438 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-metrics-client-ca\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120570 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120469 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"1854add1-b31b-41e4-b956-a96afbbfcf9f\" (UID: \"1854add1-b31b-41e4-b956-a96afbbfcf9f\") "
Apr 16 18:20:47.120851 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120567 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:20:47.120851 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.120754 2562 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.122665 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.122455 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-volume" (OuterVolumeSpecName: "config-volume") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:47.122665 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.122633 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:47.122898 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.122872 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-kube-api-access-bv9k4" (OuterVolumeSpecName: "kube-api-access-bv9k4") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "kube-api-access-bv9k4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:20:47.122976 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.122895 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:20:47.123247 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.123223 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:20:47.123408 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.123366 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:47.123733 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.123708 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:47.124089 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.124064 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-out" (OuterVolumeSpecName: "config-out") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:20:47.124403 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.124377 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:47.124514 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.124498 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:20:47.128018 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.127989 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:47.134455 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.134430 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-web-config" (OuterVolumeSpecName: "web-config") pod "1854add1-b31b-41e4-b956-a96afbbfcf9f" (UID: "1854add1-b31b-41e4-b956-a96afbbfcf9f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:47.221769 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221730 2562 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-out\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.221769 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221762 2562 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1854add1-b31b-41e4-b956-a96afbbfcf9f-alertmanager-main-db\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.221769 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221778 2562 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-tls-assets\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.222025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221792 2562 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-web-config\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.222025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221803 2562 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1854add1-b31b-41e4-b956-a96afbbfcf9f-metrics-client-ca\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.222025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221815 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.222025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221828 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-main-tls\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.222025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221842 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.222025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221857 2562 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-cluster-tls-config\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.222025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221868 2562 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-config-volume\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.222025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221880 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bv9k4\" (UniqueName: \"kubernetes.io/projected/1854add1-b31b-41e4-b956-a96afbbfcf9f-kube-api-access-bv9k4\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.222025 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.221891 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1854add1-b31b-41e4-b956-a96afbbfcf9f-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:20:47.509485 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.509449 2562 generic.go:358] "Generic (PLEG): container finished" podID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerID="e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0" exitCode=0
Apr 16 18:20:47.509485 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.509475 2562 generic.go:358] "Generic (PLEG): container finished" podID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerID="1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd" exitCode=0
Apr 16 18:20:47.509721 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.509537 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerDied","Data":"e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0"}
Apr 16 18:20:47.509721 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.509558 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:20:47.509721 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.509577 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerDied","Data":"1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd"}
Apr 16 18:20:47.509721 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.509589 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1854add1-b31b-41e4-b956-a96afbbfcf9f","Type":"ContainerDied","Data":"22549cf2bb9c33bdc92367c0b90881cac2e47285a7787e2baa0d711086e1bb05"}
Apr 16 18:20:47.509721 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.509604 2562 scope.go:117] "RemoveContainer" containerID="2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949"
Apr 16 18:20:47.517233 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.517214 2562 scope.go:117] "RemoveContainer" containerID="e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0"
Apr 16 18:20:47.524386 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.524367 2562 scope.go:117] "RemoveContainer" containerID="b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b"
Apr 16 18:20:47.531348 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.531329 2562 scope.go:117] "RemoveContainer" containerID="1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd"
Apr 16 18:20:47.537361 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.537323 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:20:47.540243 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.540207 2562 scope.go:117] "RemoveContainer" containerID="eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad"
Apr 16 18:20:47.541877 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.541850 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:20:47.546960 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.546937 2562 scope.go:117] "RemoveContainer" containerID="beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729"
Apr 16 18:20:47.553508 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.553489 2562 scope.go:117] "RemoveContainer" containerID="10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d"
Apr 16 18:20:47.559957 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.559941 2562 scope.go:117] "RemoveContainer" containerID="2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949"
Apr 16 18:20:47.560237 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:20:47.560216 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949\": container with ID starting with 2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949 not found: ID does not exist" containerID="2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949"
Apr 16 18:20:47.560305 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.560245 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949"} err="failed to get container status \"2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949\": rpc error: code = NotFound desc = could not find container \"2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949\": container with ID starting with 2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949 not found: ID does not exist"
Apr 16 18:20:47.560305 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.560278 2562 scope.go:117] "RemoveContainer" containerID="e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0"
Apr 16 18:20:47.560489 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:20:47.560473 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0\": container with ID starting with e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0 not found: ID does not exist" containerID="e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0"
Apr 16 18:20:47.560530 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.560496 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0"} err="failed to get container status \"e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0\": rpc error: code = NotFound desc = could not find container \"e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0\": container with ID starting with e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0 not found: ID does not exist"
Apr 16 18:20:47.560530 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.560515 2562 scope.go:117] "RemoveContainer" containerID="b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b"
Apr 16 18:20:47.560793 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:20:47.560767 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b\": container with ID starting with b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b not found: ID does not exist" containerID="b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b"
Apr 16 18:20:47.560882 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.560794 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b"} err="failed to get container status \"b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b\": rpc error: code = NotFound desc = could not find container \"b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b\": container with ID starting with b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b not found: ID does not exist"
Apr 16 18:20:47.560882 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.560814 2562 scope.go:117] "RemoveContainer" containerID="1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd"
Apr 16 18:20:47.561176 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:20:47.561118 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd\": container with ID starting with 1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd not found: ID does not exist" containerID="1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd"
Apr 16 18:20:47.561280 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.561181 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd"} err="failed to get container status \"1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd\": rpc error: code = NotFound desc = could not find container \"1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd\": container with ID starting with 1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd not found: ID does not exist"
Apr 16 18:20:47.561280 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.561232 2562 scope.go:117] "RemoveContainer" containerID="eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad"
Apr 16 18:20:47.561523 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:20:47.561499 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad\": container with ID starting with eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad not found: ID does not exist" containerID="eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad"
Apr 16 18:20:47.561614 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.561530 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad"} err="failed to get container status \"eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad\": rpc error: code = NotFound desc = could not find container \"eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad\": container with ID starting with eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad not found: ID does not exist"
Apr 16 18:20:47.561614 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.561547 2562 scope.go:117] "RemoveContainer" containerID="beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729"
Apr 16 18:20:47.561769 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:20:47.561756 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729\": container with ID starting with beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729 not found: ID does not exist" containerID="beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729"
Apr 16 18:20:47.561804 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.561773 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729"} err="failed to get container status \"beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729\": rpc error: code = NotFound desc = could not find container \"beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729\": container with ID starting with beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729 not found: ID does not exist"
Apr 16 18:20:47.561804 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.561785 2562 scope.go:117] "RemoveContainer" containerID="10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d"
Apr 16 18:20:47.561993 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:20:47.561977 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d\": container with ID starting with 10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d not found: ID does not exist" containerID="10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d"
Apr 16 18:20:47.562034 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.561997 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d"} err="failed to get container status \"10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d\": rpc error: code = NotFound desc = could not find container \"10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d\": container with ID starting with 10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d not found: ID does not exist"
Apr 16 18:20:47.562034 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.562008 2562 scope.go:117] "RemoveContainer" containerID="2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949"
Apr 16 18:20:47.562231 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.562212 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949"} err="failed to get container status \"2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949\": rpc error: code = NotFound desc = could not find container \"2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949\": container with ID starting with 2816de030fe13a4eb9d1bd7806912dff06436769a93273399265b89c62132949 not found: ID does not exist"
Apr 16 18:20:47.562231 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.562232 2562 scope.go:117] "RemoveContainer" containerID="e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0"
Apr 16 18:20:47.562442 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.562424 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0"} err="failed to get container status \"e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0\": rpc error: code = NotFound desc = could not find container \"e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0\": container with ID starting with e929dcebcf4c2a6362ba7af523e2c6958fa5e8501ba402377e858a66d67eefd0 not found: ID does not exist"
Apr 16 18:20:47.562501 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.562451 2562 scope.go:117] "RemoveContainer" containerID="b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b"
Apr 16 18:20:47.562685 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.562670 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b"} err="failed to get container status \"b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b\": rpc error: code = NotFound desc = could not find container \"b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b\": container with ID starting with b0772e7f4b82807829134e5b5115e7e0e4c45e7b167aed54df56c8a9eadc017b not found: ID does not exist"
Apr 16 18:20:47.562719 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.562686 2562 scope.go:117] "RemoveContainer" containerID="1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd"
Apr 16 18:20:47.562892 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.562871 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd"} err="failed to get container status \"1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd\": rpc error: code = NotFound desc = could not find container \"1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd\": container with ID starting with 1fb8e342b214b5335cdce28c9eabb03ed0795a33f6355affd97a2a85840af0fd not found: ID does not exist"
Apr 16 18:20:47.562943 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.562893 2562 scope.go:117] "RemoveContainer" containerID="eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad"
Apr 16 18:20:47.563095 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.563079 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad"} err="failed to get container status \"eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad\": rpc error: code = NotFound desc = could not find container \"eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad\": container with ID starting with eda276e1826017a78161b8dfbec1997e00d756c7c037cc9b6aef9f66f30a35ad not found: ID does not exist"
Apr 16 18:20:47.563131 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.563096 2562 scope.go:117] "RemoveContainer" containerID="beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729"
Apr 16 18:20:47.563290 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.563275 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729"} err="failed to get container status \"beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729\": rpc error: code = NotFound desc = could not find container \"beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729\": container with ID starting with beb3f60fb2d587eaeb8a0d7ecddd5817b3e51a25321fa75d45fc47b30eaf6729 not found: ID does not exist"
Apr 16 18:20:47.563336 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.563290 2562 scope.go:117] "RemoveContainer" containerID="10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d"
Apr 16 18:20:47.563456 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.563442 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d"} err="failed to get container status \"10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d\": rpc error: code = NotFound desc = could not find container \"10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d\": container with ID starting with 10f98d5fd8866d8e84c4502d194843d908a91b271478fc412a42aed6403b9d9d not found: ID does not exist"
Apr 16 18:20:47.571433 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571402 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:20:47.571691 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571678 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy-metric"
Apr 16 18:20:47.571740 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571693 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy-metric"
Apr 16 18:20:47.571740 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571706 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="config-reloader"
Apr 16 18:20:47.571740 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571712 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="config-reloader"
Apr 16 18:20:47.571740 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571719 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy-web"
Apr 16 18:20:47.571740 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571726 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy-web"
Apr 16 18:20:47.571740 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571735 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="init-config-reloader"
Apr 16 18:20:47.571740 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571740 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="init-config-reloader"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571747 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="prom-label-proxy"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571752 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="prom-label-proxy"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571760 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55883b38-898b-4426-9e22-f96a487c90c6" containerName="registry"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571765 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="55883b38-898b-4426-9e22-f96a487c90c6" containerName="registry"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571774 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="alertmanager"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571779 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="alertmanager"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571787 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571792 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571833 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="prom-label-proxy"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571841 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="alertmanager"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571847 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="config-reloader"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571852 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571857 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy-metric"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571872 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="55883b38-898b-4426-9e22-f96a487c90c6" containerName="registry"
Apr 16 18:20:47.571939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.571878 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" containerName="kube-rbac-proxy-web"
Apr 16 18:20:47.575101 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.575085 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:20:47.577823 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.577802 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 18:20:47.577954 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.577827 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 18:20:47.577954 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.577851 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 18:20:47.577954 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.577914 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 18:20:47.577954 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.577924 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 18:20:47.577954 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.577802 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 18:20:47.578228 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.578142 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 18:20:47.578228 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.578224 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 18:20:47.578462 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.578446 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-m9dzg\""
Apr 16 18:20:47.589049 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.589027 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 18:20:47.599172 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.599146 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:20:47.726273 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726221 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a76b197-3ec2-40d0-a268-c2dde55da620-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:20:47.726273 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726271 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxjsr\" (UniqueName: \"kubernetes.io/projected/1a76b197-3ec2-40d0-a268-c2dde55da620-kube-api-access-pxjsr\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:20:47.726486 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726303 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:20:47.726486 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726330 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:20:47.726486 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726349 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:20:47.726486 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726381 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:20:47.726486 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726414 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:20:47.726486 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726431 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\"
(UniqueName: \"kubernetes.io/configmap/1a76b197-3ec2-40d0-a268-c2dde55da620-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.726669 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726512 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a76b197-3ec2-40d0-a268-c2dde55da620-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.726669 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726548 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-web-config\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.726669 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726568 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a76b197-3ec2-40d0-a268-c2dde55da620-config-out\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.726669 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726588 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a76b197-3ec2-40d0-a268-c2dde55da620-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.726669 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.726609 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.827728 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.827677 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a76b197-3ec2-40d0-a268-c2dde55da620-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.827728 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.827731 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-web-config\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.827942 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.827849 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a76b197-3ec2-40d0-a268-c2dde55da620-config-out\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
18:20:47.827942 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.827891 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a76b197-3ec2-40d0-a268-c2dde55da620-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.827942 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.827919 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.828033 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.827964 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a76b197-3ec2-40d0-a268-c2dde55da620-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.828033 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.827988 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxjsr\" (UniqueName: \"kubernetes.io/projected/1a76b197-3ec2-40d0-a268-c2dde55da620-kube-api-access-pxjsr\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.828033 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.828019 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.828166 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.828043 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.828166 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.828084 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.828166 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.828151 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.828366 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.828205 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" 
(UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.828366 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.828234 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a76b197-3ec2-40d0-a268-c2dde55da620-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.828551 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.828524 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a76b197-3ec2-40d0-a268-c2dde55da620-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.829409 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.829092 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a76b197-3ec2-40d0-a268-c2dde55da620-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.831040 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.831012 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.831040 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.831032 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a76b197-3ec2-40d0-a268-c2dde55da620-config-out\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.831203 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.831037 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a76b197-3ec2-40d0-a268-c2dde55da620-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.831203 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.831070 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.831321 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.831281 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a76b197-3ec2-40d0-a268-c2dde55da620-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
18:20:47.831321 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.831305 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.831418 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.831341 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.831744 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.831727 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-web-config\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.831859 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.831836 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.832994 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.832975 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a76b197-3ec2-40d0-a268-c2dde55da620-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.837134 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.837114 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxjsr\" (UniqueName: \"kubernetes.io/projected/1a76b197-3ec2-40d0-a268-c2dde55da620-kube-api-access-pxjsr\") pod \"alertmanager-main-0\" (UID: \"1a76b197-3ec2-40d0-a268-c2dde55da620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:47.889445 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:47.889413 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:48.019762 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:48.019734 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:20:48.021981 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:20:48.021959 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a76b197_3ec2_40d0_a268_c2dde55da620.slice/crio-2566bdfc1784b7325578362d5a093ef7de1418a6ae240e8faf327d0aea967491 WatchSource:0}: Error finding container 2566bdfc1784b7325578362d5a093ef7de1418a6ae240e8faf327d0aea967491: Status 404 returned error can't find the container with id 2566bdfc1784b7325578362d5a093ef7de1418a6ae240e8faf327d0aea967491 Apr 16 18:20:48.513853 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:48.513820 2562 generic.go:358] "Generic (PLEG): container finished" podID="1a76b197-3ec2-40d0-a268-c2dde55da620" containerID="61f3c85054f4e8cbc451dc4eda1a901de874c4c37d35e3f8ef9e42a36b310c71" exitCode=0 Apr 16 18:20:48.514014 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:48.513907 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a76b197-3ec2-40d0-a268-c2dde55da620","Type":"ContainerDied","Data":"61f3c85054f4e8cbc451dc4eda1a901de874c4c37d35e3f8ef9e42a36b310c71"} Apr 16 18:20:48.514014 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:48.513948 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a76b197-3ec2-40d0-a268-c2dde55da620","Type":"ContainerStarted","Data":"2566bdfc1784b7325578362d5a093ef7de1418a6ae240e8faf327d0aea967491"} Apr 16 18:20:49.522964 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.522924 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1854add1-b31b-41e4-b956-a96afbbfcf9f" path="/var/lib/kubelet/pods/1854add1-b31b-41e4-b956-a96afbbfcf9f/volumes" Apr 16 18:20:49.523450 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.523432 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a76b197-3ec2-40d0-a268-c2dde55da620","Type":"ContainerStarted","Data":"608879da87aaca0af5a6ee9d254e4682e3a12524af48cf85e26ed85faabd60fb"} Apr 16 18:20:49.523492 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.523477 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a76b197-3ec2-40d0-a268-c2dde55da620","Type":"ContainerStarted","Data":"714aa58c2eb87a39014ffd18a8d18b340978b3e6593d278d3a8d9e5c5ae92064"} Apr 16 18:20:49.523492 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.523489 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a76b197-3ec2-40d0-a268-c2dde55da620","Type":"ContainerStarted","Data":"8bc4afd505af49b3b496db1d26fed417b58c8dd0d80041fcce380cfe050d4980"} Apr 16 18:20:49.523552 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.523497 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a76b197-3ec2-40d0-a268-c2dde55da620","Type":"ContainerStarted","Data":"7f1407a447829d4b3c929b5014a86c6995a3deba80fe2ce9adc4e1a4438b552e"} Apr 16 18:20:49.523552 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.523507 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"1a76b197-3ec2-40d0-a268-c2dde55da620","Type":"ContainerStarted","Data":"afd5f8dd55ac652d1fd938fb4b279af84c9a5ce13dbe5645d587773a1060208d"} Apr 16 18:20:49.523552 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.523514 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a76b197-3ec2-40d0-a268-c2dde55da620","Type":"ContainerStarted","Data":"aa1744eaa11aa108e723ab2167bb02f56e91078f78ee80bd6653fc79890e5f25"} Apr 16 18:20:49.551784 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.551730 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.551712951 podStartE2EDuration="2.551712951s" podCreationTimestamp="2026-04-16 18:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:49.549938012 +0000 UTC m=+260.635387416" watchObservedRunningTime="2026-04-16 18:20:49.551712951 +0000 UTC m=+260.637162342" Apr 16 18:20:49.746826 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.746791 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-cb7556664-bmkxb"] Apr 16 18:20:49.750317 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.750301 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.752864 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.752842 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 18:20:49.752970 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.752852 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 18:20:49.753727 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.753509 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-j7r2l\"" Apr 16 18:20:49.753727 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.753580 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 18:20:49.754387 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.753892 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 18:20:49.754387 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.754257 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 18:20:49.761939 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.761905 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 18:20:49.763246 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.763223 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-cb7556664-bmkxb"] Apr 16 18:20:49.847720 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.847624 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.847720 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.847707 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbxrz\" (UniqueName: \"kubernetes.io/projected/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-kube-api-access-lbxrz\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.847904 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.847733 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.847904 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.847763 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-secret-telemeter-client\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.847904 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.847815 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-telemeter-client-tls\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.847904 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.847856 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-serving-certs-ca-bundle\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.847904 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.847876 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-federate-client-tls\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.847904 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.847891 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-metrics-client-ca\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.949297 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.949260 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-secret-telemeter-client\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.949413 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.949304 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-telemeter-client-tls\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.949413 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.949338 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-serving-certs-ca-bundle\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.949413 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.949364 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-federate-client-tls\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.949413 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.949379 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-metrics-client-ca\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.949413 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.949400 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-telemeter-trusted-ca-bundle\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.949623 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.949469 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbxrz\" (UniqueName: \"kubernetes.io/projected/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-kube-api-access-lbxrz\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.949623 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.949505 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.950233 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.950180 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-metrics-client-ca\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.950369 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.950343 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-serving-certs-ca-bundle\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.950426 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.950384 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-telemeter-trusted-ca-bundle\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.952030 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.952005 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-federate-client-tls\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.952122 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.952016 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-secret-telemeter-client\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.952165 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.952146 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.952265 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.952247 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-telemeter-client-tls\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:49.960134 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:49.960112 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbxrz\" (UniqueName: \"kubernetes.io/projected/322a4c8c-9e9e-4c9c-9edc-04fb1081dc99-kube-api-access-lbxrz\") pod \"telemeter-client-cb7556664-bmkxb\" (UID: \"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99\") " pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:50.016381 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.016343 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 
18:20:50.016855 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.016824 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="prometheus" containerID="cri-o://7da9a1db543a25c41e5915e29345de8cc01dc8df0b4ac77d5c95d4b901cfeb3d" gracePeriod=600 Apr 16 18:20:50.016991 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.016858 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="thanos-sidecar" containerID="cri-o://70217f856ff192114119a2793d56b01b0efdbfe7e0df28fc87986cdb488dde88" gracePeriod=600 Apr 16 18:20:50.016991 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.016860 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy" containerID="cri-o://c5ba245a2df2f9026b9d96d9e12b64b0b6616ce7aa9959832f45fec0ccea7697" gracePeriod=600 Apr 16 18:20:50.016991 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.016885 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy-thanos" containerID="cri-o://38f89b64f9c34c84787732301b2dd5be226265bff4949e9b35f342ec1194557c" gracePeriod=600 Apr 16 18:20:50.016991 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.016891 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="config-reloader" containerID="cri-o://ca5b44c0f3a692d12358376d9420816cd24ebb761ebf63eb8757752f009449a1" gracePeriod=600 Apr 16 18:20:50.016991 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.016966 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy-web" containerID="cri-o://560859b45c85ee3369d06784219eb58e4766d8d7c55fc01b9fcca9a09c49a65a" gracePeriod=600 Apr 16 18:20:50.064922 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.064889 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" Apr 16 18:20:50.192278 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.192232 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-cb7556664-bmkxb"] Apr 16 18:20:50.194330 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:20:50.194304 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod322a4c8c_9e9e_4c9c_9edc_04fb1081dc99.slice/crio-b6950ef900e264b2359302cd6c48660481dd48ce11034bc6968a4daeba0b9c66 WatchSource:0}: Error finding container b6950ef900e264b2359302cd6c48660481dd48ce11034bc6968a4daeba0b9c66: Status 404 returned error can't find the container with id b6950ef900e264b2359302cd6c48660481dd48ce11034bc6968a4daeba0b9c66 Apr 16 18:20:50.526597 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.526568 2562 generic.go:358] "Generic (PLEG): container finished" podID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerID="38f89b64f9c34c84787732301b2dd5be226265bff4949e9b35f342ec1194557c" exitCode=0 Apr 16 18:20:50.526597 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.526592 2562 generic.go:358] "Generic (PLEG): container finished" podID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerID="c5ba245a2df2f9026b9d96d9e12b64b0b6616ce7aa9959832f45fec0ccea7697" exitCode=0 Apr 16 18:20:50.526597 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.526598 2562 generic.go:358] "Generic (PLEG): container finished" podID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerID="70217f856ff192114119a2793d56b01b0efdbfe7e0df28fc87986cdb488dde88" exitCode=0 Apr 16 18:20:50.526597 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.526604 2562 generic.go:358] "Generic (PLEG): container finished" podID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerID="ca5b44c0f3a692d12358376d9420816cd24ebb761ebf63eb8757752f009449a1" exitCode=0 Apr 16 18:20:50.526597 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.526609 2562 generic.go:358] "Generic (PLEG): container finished" podID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerID="7da9a1db543a25c41e5915e29345de8cc01dc8df0b4ac77d5c95d4b901cfeb3d" exitCode=0 Apr 16 18:20:50.527256 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.526640 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerDied","Data":"38f89b64f9c34c84787732301b2dd5be226265bff4949e9b35f342ec1194557c"} Apr 16 18:20:50.527256 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.526678 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerDied","Data":"c5ba245a2df2f9026b9d96d9e12b64b0b6616ce7aa9959832f45fec0ccea7697"} Apr 16 18:20:50.527256 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.526694 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerDied","Data":"70217f856ff192114119a2793d56b01b0efdbfe7e0df28fc87986cdb488dde88"} Apr 16 18:20:50.527256 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.526707 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerDied","Data":"ca5b44c0f3a692d12358376d9420816cd24ebb761ebf63eb8757752f009449a1"} Apr 16 18:20:50.527256 
ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.526721 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerDied","Data":"7da9a1db543a25c41e5915e29345de8cc01dc8df0b4ac77d5c95d4b901cfeb3d"} Apr 16 18:20:50.527719 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:50.527700 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" event={"ID":"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99","Type":"ContainerStarted","Data":"b6950ef900e264b2359302cd6c48660481dd48ce11034bc6968a4daeba0b9c66"} Apr 16 18:20:51.534630 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.534592 2562 generic.go:358] "Generic (PLEG): container finished" podID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerID="560859b45c85ee3369d06784219eb58e4766d8d7c55fc01b9fcca9a09c49a65a" exitCode=0 Apr 16 18:20:51.535043 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.534656 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerDied","Data":"560859b45c85ee3369d06784219eb58e4766d8d7c55fc01b9fcca9a09c49a65a"} Apr 16 18:20:51.690913 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.690888 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:51.868325 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868305 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-rulefiles-0\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868417 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868344 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868417 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868380 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config-out\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868417 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868410 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-tls\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868567 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868439 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-trusted-ca-bundle\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868567 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868473 2562 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-serving-certs-ca-bundle\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868567 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868510 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-kubelet-serving-ca-bundle\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868567 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868536 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868567 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868562 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-metrics-client-certs\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868800 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868611 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-thanos-prometheus-http-client-file\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868800 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868646 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-metrics-client-ca\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868800 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868671 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8gcr\" (UniqueName: \"kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-kube-api-access-h8gcr\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868800 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868722 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-kube-rbac-proxy\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868800 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868756 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-tls-assets\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") " Apr 16 18:20:51.868800 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868780 
2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-grpc-tls\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") "
Apr 16 18:20:51.869066 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868809 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-db\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") "
Apr 16 18:20:51.869066 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868872 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-web-config\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") "
Apr 16 18:20:51.869066 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.868903 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config\") pod \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\" (UID: \"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141\") "
Apr 16 18:20:51.869510 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.869475 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:20:51.869672 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.869648 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:20:51.869742 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.869678 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:20:51.869889 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.869860 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:20:51.870940 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.870673 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:20:51.871724 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.871634 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:20:51.873782 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.872547 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config-out" (OuterVolumeSpecName: "config-out") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:20:51.873782 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.872675 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:20:51.873782 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.872704 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:51.873782 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.872757 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:51.873782 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.872820 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config" (OuterVolumeSpecName: "config") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:51.873782 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.873466 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:51.873782 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.873528 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-kube-api-access-h8gcr" (OuterVolumeSpecName: "kube-api-access-h8gcr") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "kube-api-access-h8gcr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:20:51.874222 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.873834 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:51.874386 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.874357 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:51.874386 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.874371 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:51.875165 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.875133 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:51.886140 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.886105 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-web-config" (OuterVolumeSpecName: "web-config") pod "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" (UID: "1783d49b-8e3f-4ed3-be87-7b8a3c1d4141"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970280 2562 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-kube-rbac-proxy\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970312 2562 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-tls-assets\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970328 2562 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-grpc-tls\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970340 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-db\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970353 2562 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-web-config\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970364 2562 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970375 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970389 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970401 2562 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-config-out\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970417 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-tls\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970431 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-prometheus-trusted-ca-bundle\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath 
\"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970444 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970459 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970472 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970486 2562 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-secret-metrics-client-certs\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970498 2562 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-thanos-prometheus-http-client-file\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970512 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-configmap-metrics-client-ca\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:51.972204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:51.970527 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8gcr\" (UniqueName: \"kubernetes.io/projected/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141-kube-api-access-h8gcr\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:20:52.539461 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.539424 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" event={"ID":"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99","Type":"ContainerStarted","Data":"d5ee23c80f25bef41d78b6428668cc59ccdec062f0024534484f1d498b0545af"} Apr 16 18:20:52.539461 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.539468 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" event={"ID":"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99","Type":"ContainerStarted","Data":"b0755c26bbb48994b47db5d58880c2205ede21d9987bd30ad3be8a29e796a0a7"} Apr 16 18:20:52.539964 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.539483 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" event={"ID":"322a4c8c-9e9e-4c9c-9edc-04fb1081dc99","Type":"ContainerStarted","Data":"ff7376e31b8e0783f76238188d64f9c961560b9484e6084a62985ade12bab158"} Apr 16 18:20:52.541880 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.541857 2562 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141","Type":"ContainerDied","Data":"95410369aff54908cc69e48ce39bc77b0fac8c2f93280ac1787cc479d6b223e4"} Apr 16 18:20:52.542001 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.541891 2562 scope.go:117] "RemoveContainer" containerID="38f89b64f9c34c84787732301b2dd5be226265bff4949e9b35f342ec1194557c" Apr 16 18:20:52.542001 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.541923 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.552789 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.552767 2562 scope.go:117] "RemoveContainer" containerID="c5ba245a2df2f9026b9d96d9e12b64b0b6616ce7aa9959832f45fec0ccea7697" Apr 16 18:20:52.559878 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.559857 2562 scope.go:117] "RemoveContainer" containerID="560859b45c85ee3369d06784219eb58e4766d8d7c55fc01b9fcca9a09c49a65a" Apr 16 18:20:52.566375 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.566355 2562 scope.go:117] "RemoveContainer" containerID="70217f856ff192114119a2793d56b01b0efdbfe7e0df28fc87986cdb488dde88" Apr 16 18:20:52.567639 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.567590 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-cb7556664-bmkxb" podStartSLOduration=2.038029049 podStartE2EDuration="3.567573898s" podCreationTimestamp="2026-04-16 18:20:49 +0000 UTC" firstStartedPulling="2026-04-16 18:20:50.196077191 +0000 UTC m=+261.281526560" lastFinishedPulling="2026-04-16 18:20:51.725622036 +0000 UTC m=+262.811071409" observedRunningTime="2026-04-16 18:20:52.565843884 +0000 UTC m=+263.651293276" watchObservedRunningTime="2026-04-16 18:20:52.567573898 +0000 UTC m=+263.653023290" Apr 16 18:20:52.574705 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.574687 2562 scope.go:117] "RemoveContainer" containerID="ca5b44c0f3a692d12358376d9420816cd24ebb761ebf63eb8757752f009449a1" Apr 16 18:20:52.581256 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.581241 2562 scope.go:117] "RemoveContainer" containerID="7da9a1db543a25c41e5915e29345de8cc01dc8df0b4ac77d5c95d4b901cfeb3d" Apr 16 18:20:52.586067 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.586046 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:52.591018 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.589653 2562 scope.go:117] "RemoveContainer" containerID="dda1dea3c25e5dea0f0c61b7647b0cca4aa2d5ab33e477a1ea7d68704b83bcf3" Apr 16 18:20:52.594332 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.594309 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:52.618309 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618271 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:52.618675 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618661 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="prometheus" Apr 16 18:20:52.618730 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618678 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="prometheus" Apr 16 18:20:52.618730 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618687 2562 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy" Apr 16 18:20:52.618730 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618693 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy" Apr 16 18:20:52.618730 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618701 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="thanos-sidecar" Apr 16 18:20:52.618730 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618707 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="thanos-sidecar" Apr 16 18:20:52.618730 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618713 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy-web" Apr 16 18:20:52.618730 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618720 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy-web" Apr 16 18:20:52.618730 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618730 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy-thanos" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618735 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy-thanos" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618744 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="init-config-reloader" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618749 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="init-config-reloader" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618762 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="config-reloader" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618767 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="config-reloader" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618813 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy-thanos" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618821 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="config-reloader" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618828 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618835 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="prometheus" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: 
I0416 18:20:52.618840 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="kube-rbac-proxy-web" Apr 16 18:20:52.618981 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.618847 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" containerName="thanos-sidecar" Apr 16 18:20:52.623720 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.623704 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.626054 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.626033 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:20:52.626133 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.626035 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:20:52.626243 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.626217 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:20:52.626335 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.626322 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:20:52.626624 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.626610 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:20:52.626725 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.626707 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:20:52.626783 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.626719 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:20:52.626783 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.626707 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:20:52.626876 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.626863 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:20:52.627167 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.627149 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:20:52.627270 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.627149 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bu6cc6r7m8ird\"" Apr 16 18:20:52.627270 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.627155 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-r6rnm\"" Apr 16 18:20:52.629717 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.629697 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:20:52.632718 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.632698 2562 reflector.go:430] "Caches populated" 
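The RemoveStaleState pairs above show the CPU and memory managers dropping per-container resource assignments keyed by the old pod UID before the replacement pod (same name, new UID) is admitted; otherwise the new containers could inherit stale pinning state. A toy version of that cleanup, with a hypothetical assignment map in place of the managers' real state stores:

package main

import "fmt"

// key mirrors the (podUID, containerName) pair printed by
// cpu_manager.go:401 and memory_manager.go:356 above; hypothetical.
type key struct{ podUID, container string }

// removeStaleState drops assignments belonging to pods that are no
// longer live. Deleting map entries while ranging is safe in Go.
func removeStaleState(assignments map[key][]int, livePods map[string]bool) {
	for k := range assignments {
		if !livePods[k.podUID] {
			fmt.Printf("RemoveStaleState: containerMap: removing container podUID=%q containerName=%q\n",
				k.podUID, k.container)
			delete(assignments, k)
		}
	}
}

func main() {
	a := map[key][]int{
		{"1783d49b-8e3f-4ed3-be87-7b8a3c1d4141", "prometheus"}: {0, 1},
	}
	removeStaleState(a, map[string]bool{}) // the old UID is gone from the API
}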
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:20:52.636953 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.636930 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:52.776855 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.776813 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.776855 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.776857 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aedd43ae-ab04-46bb-a249-ad6749c68f29-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777059 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.776913 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aedd43ae-ab04-46bb-a249-ad6749c68f29-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777059 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.776941 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777059 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777022 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-config\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777059 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777047 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777218 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777067 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777218 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777108 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-web-config\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777218 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777128 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777218 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777144 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777385 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777225 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aedd43ae-ab04-46bb-a249-ad6749c68f29-config-out\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777385 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777251 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777385 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777275 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777385 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777301 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777385 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777326 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777385 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777362 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777546 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777385 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27vxg\" (UniqueName: \"kubernetes.io/projected/aedd43ae-ab04-46bb-a249-ad6749c68f29-kube-api-access-27vxg\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.777546 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.777441 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.878323 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.878219 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aedd43ae-ab04-46bb-a249-ad6749c68f29-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.878323 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.878278 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.878323 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.878323 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-config\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.878598 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.878353 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.878598 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.878382 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.878598 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.878515 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-web-config\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.878598 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.878550 2562 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.878598 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.878595 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.879019 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.878632 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aedd43ae-ab04-46bb-a249-ad6749c68f29-config-out\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.879359 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.879340 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.879614 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.879251 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.879614 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.879395 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.879824 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.879801 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.880081 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.879545 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.880166 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.880128 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.880166 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.880161 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.880306 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.880218 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.880306 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.880248 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27vxg\" (UniqueName: \"kubernetes.io/projected/aedd43ae-ab04-46bb-a249-ad6749c68f29-kube-api-access-27vxg\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.880306 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.880298 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.880457 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.880334 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.880457 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.880373 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aedd43ae-ab04-46bb-a249-ad6749c68f29-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.880927 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.880807 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aedd43ae-ab04-46bb-a249-ad6749c68f29-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.881874 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.881825 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.881874 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.881860 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-web-config\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.882081 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.882055 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aedd43ae-ab04-46bb-a249-ad6749c68f29-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.882433 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.882393 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.882920 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.882898 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aedd43ae-ab04-46bb-a249-ad6749c68f29-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.883067 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.882945 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-config\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.883216 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.883176 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.883638 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.883614 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.883812 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.883788 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.884294 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.884266 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.884579 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.884560 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aedd43ae-ab04-46bb-a249-ad6749c68f29-config-out\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.884671 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.884651 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.885118 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.885099 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aedd43ae-ab04-46bb-a249-ad6749c68f29-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.889182 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.889164 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27vxg\" (UniqueName: \"kubernetes.io/projected/aedd43ae-ab04-46bb-a249-ad6749c68f29-kube-api-access-27vxg\") pod \"prometheus-k8s-0\" (UID: \"aedd43ae-ab04-46bb-a249-ad6749c68f29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:52.934168 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:52.934127 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:53.065218 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:53.065165 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:53.068224 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:20:53.068177 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaedd43ae_ab04_46bb_a249_ad6749c68f29.slice/crio-8dda4acf278bed497ec80f8e96d3143c3f932fd46c76a11154ae98dfc80d3894 WatchSource:0}: Error finding container 8dda4acf278bed497ec80f8e96d3143c3f932fd46c76a11154ae98dfc80d3894: Status 404 returned error can't find the container with id 8dda4acf278bed497ec80f8e96d3143c3f932fd46c76a11154ae98dfc80d3894 Apr 16 18:20:53.523974 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:53.523938 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1783d49b-8e3f-4ed3-be87-7b8a3c1d4141" path="/var/lib/kubelet/pods/1783d49b-8e3f-4ed3-be87-7b8a3c1d4141/volumes" Apr 16 18:20:53.546764 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:53.546728 2562 generic.go:358] "Generic (PLEG): container finished" podID="aedd43ae-ab04-46bb-a249-ad6749c68f29" containerID="610a2ded13e8879c892a66c5f5cd2b53fc2cd713aafe359f00f4984956e9ea66" exitCode=0 Apr 16 18:20:53.547121 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:53.546816 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aedd43ae-ab04-46bb-a249-ad6749c68f29","Type":"ContainerDied","Data":"610a2ded13e8879c892a66c5f5cd2b53fc2cd713aafe359f00f4984956e9ea66"} Apr 16 18:20:53.547121 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:53.546846 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"aedd43ae-ab04-46bb-a249-ad6749c68f29","Type":"ContainerStarted","Data":"8dda4acf278bed497ec80f8e96d3143c3f932fd46c76a11154ae98dfc80d3894"} Apr 16 18:20:54.553014 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:54.552976 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aedd43ae-ab04-46bb-a249-ad6749c68f29","Type":"ContainerStarted","Data":"79d5145abba35d8bed5d7dbf28e0d5e969afd4ec9c8f15f58894d7ec11061baf"} Apr 16 18:20:54.553014 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:54.553016 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aedd43ae-ab04-46bb-a249-ad6749c68f29","Type":"ContainerStarted","Data":"235903cef05edb52532cf2207d653adfd39d47d3438269596e2c2fd84de11e82"} Apr 16 18:20:54.553596 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:54.553026 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aedd43ae-ab04-46bb-a249-ad6749c68f29","Type":"ContainerStarted","Data":"79c85dc241cd470596db7c7f1f6ebc72a89f279403431deb53cc8164661f28e5"} Apr 16 18:20:54.553596 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:54.553034 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aedd43ae-ab04-46bb-a249-ad6749c68f29","Type":"ContainerStarted","Data":"0ececf95ddf1a7d95fe248e3951e19df5fd69ad692d76cfdb66092fc209a731a"} Apr 16 18:20:54.553596 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:54.553043 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aedd43ae-ab04-46bb-a249-ad6749c68f29","Type":"ContainerStarted","Data":"be5153cf1935eb966a97e115cca93bcc12db89cf6a8d11a2f5ea24332fc66ce5"} Apr 16 18:20:54.553596 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:54.553051 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aedd43ae-ab04-46bb-a249-ad6749c68f29","Type":"ContainerStarted","Data":"fd3c17221703ed4ae6070b26210569b029413fd4772fc50873416245323437cc"} Apr 16 18:20:54.581732 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:54.581679 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.581664327 podStartE2EDuration="2.581664327s" podCreationTimestamp="2026-04-16 18:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:54.580107998 +0000 UTC m=+265.665557443" watchObservedRunningTime="2026-04-16 18:20:54.581664327 +0000 UTC m=+265.667113737" Apr 16 18:20:57.934894 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:20:57.934835 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:29.401848 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:21:29.401815 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log" Apr 16 18:21:29.409249 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:21:29.408941 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log" Apr 16 18:21:29.416461 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:21:29.416435 2562 kubelet.go:1628] "Image garbage 
collection succeeded" Apr 16 18:21:52.935378 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:21:52.935320 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:52.951064 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:21:52.951035 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:53.752885 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:21:53.752852 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:26:29.434944 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:29.434915 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log" Apr 16 18:26:29.436072 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:29.436036 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log" Apr 16 18:26:43.854745 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:43.854710 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-z9f8t"] Apr 16 18:26:43.858030 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:43.858011 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-z9f8t" Apr 16 18:26:43.860646 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:43.860623 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:26:43.860792 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:43.860679 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:26:43.860792 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:43.860764 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:26:43.860885 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:43.860848 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-8fqk7\"" Apr 16 18:26:43.865475 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:43.865452 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-z9f8t"] Apr 16 18:26:43.897416 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:43.897374 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxlzs\" (UniqueName: \"kubernetes.io/projected/8819646c-6541-4ede-a666-69bdf0884ce6-kube-api-access-lxlzs\") pod \"s3-init-z9f8t\" (UID: \"8819646c-6541-4ede-a666-69bdf0884ce6\") " pod="kserve/s3-init-z9f8t" Apr 16 18:26:43.998534 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:43.998495 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxlzs\" (UniqueName: \"kubernetes.io/projected/8819646c-6541-4ede-a666-69bdf0884ce6-kube-api-access-lxlzs\") pod \"s3-init-z9f8t\" (UID: \"8819646c-6541-4ede-a666-69bdf0884ce6\") " pod="kserve/s3-init-z9f8t" Apr 16 18:26:44.008648 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:44.008613 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxlzs\" (UniqueName: \"kubernetes.io/projected/8819646c-6541-4ede-a666-69bdf0884ce6-kube-api-access-lxlzs\") pod \"s3-init-z9f8t\" (UID: 
\"8819646c-6541-4ede-a666-69bdf0884ce6\") " pod="kserve/s3-init-z9f8t" Apr 16 18:26:44.178966 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:44.178930 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-z9f8t" Apr 16 18:26:44.301855 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:44.301823 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-z9f8t"] Apr 16 18:26:44.305145 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:26:44.305114 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8819646c_6541_4ede_a666_69bdf0884ce6.slice/crio-15bf8c9d75e8f41c4d040c6d0d6071015317e637010467f39a2dd572fe48bd61 WatchSource:0}: Error finding container 15bf8c9d75e8f41c4d040c6d0d6071015317e637010467f39a2dd572fe48bd61: Status 404 returned error can't find the container with id 15bf8c9d75e8f41c4d040c6d0d6071015317e637010467f39a2dd572fe48bd61 Apr 16 18:26:44.307318 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:44.307301 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:26:44.572993 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:44.572910 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z9f8t" event={"ID":"8819646c-6541-4ede-a666-69bdf0884ce6","Type":"ContainerStarted","Data":"15bf8c9d75e8f41c4d040c6d0d6071015317e637010467f39a2dd572fe48bd61"} Apr 16 18:26:49.593065 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:49.593024 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z9f8t" event={"ID":"8819646c-6541-4ede-a666-69bdf0884ce6","Type":"ContainerStarted","Data":"237c66dec001012e48a7dfbe467a3acc0596f19c3dbe3eda257d87cf4de1299a"} Apr 16 18:26:49.608561 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:49.608454 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-z9f8t" podStartSLOduration=2.108188306 podStartE2EDuration="6.608434044s" podCreationTimestamp="2026-04-16 18:26:43 +0000 UTC" firstStartedPulling="2026-04-16 18:26:44.307427039 +0000 UTC m=+615.392876408" lastFinishedPulling="2026-04-16 18:26:48.80767277 +0000 UTC m=+619.893122146" observedRunningTime="2026-04-16 18:26:49.60796392 +0000 UTC m=+620.693413314" watchObservedRunningTime="2026-04-16 18:26:49.608434044 +0000 UTC m=+620.693883448" Apr 16 18:26:52.607204 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:52.607158 2562 generic.go:358] "Generic (PLEG): container finished" podID="8819646c-6541-4ede-a666-69bdf0884ce6" containerID="237c66dec001012e48a7dfbe467a3acc0596f19c3dbe3eda257d87cf4de1299a" exitCode=0 Apr 16 18:26:52.607590 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:52.607218 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z9f8t" event={"ID":"8819646c-6541-4ede-a666-69bdf0884ce6","Type":"ContainerDied","Data":"237c66dec001012e48a7dfbe467a3acc0596f19c3dbe3eda257d87cf4de1299a"} Apr 16 18:26:53.730712 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:53.730688 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-z9f8t" Apr 16 18:26:53.783892 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:53.783861 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxlzs\" (UniqueName: \"kubernetes.io/projected/8819646c-6541-4ede-a666-69bdf0884ce6-kube-api-access-lxlzs\") pod \"8819646c-6541-4ede-a666-69bdf0884ce6\" (UID: \"8819646c-6541-4ede-a666-69bdf0884ce6\") " Apr 16 18:26:53.786074 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:53.786049 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8819646c-6541-4ede-a666-69bdf0884ce6-kube-api-access-lxlzs" (OuterVolumeSpecName: "kube-api-access-lxlzs") pod "8819646c-6541-4ede-a666-69bdf0884ce6" (UID: "8819646c-6541-4ede-a666-69bdf0884ce6"). InnerVolumeSpecName "kube-api-access-lxlzs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:53.884435 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:53.884353 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lxlzs\" (UniqueName: \"kubernetes.io/projected/8819646c-6541-4ede-a666-69bdf0884ce6-kube-api-access-lxlzs\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\"" Apr 16 18:26:54.613854 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:54.613824 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-z9f8t" Apr 16 18:26:54.614031 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:54.613823 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z9f8t" event={"ID":"8819646c-6541-4ede-a666-69bdf0884ce6","Type":"ContainerDied","Data":"15bf8c9d75e8f41c4d040c6d0d6071015317e637010467f39a2dd572fe48bd61"} Apr 16 18:26:54.614031 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:26:54.613934 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15bf8c9d75e8f41c4d040c6d0d6071015317e637010467f39a2dd572fe48bd61" Apr 16 18:31:29.457856 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:31:29.457772 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log" Apr 16 18:31:29.458441 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:31:29.458022 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log" Apr 16 18:32:08.015171 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.015128 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b"] Apr 16 18:32:08.017672 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.015615 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8819646c-6541-4ede-a666-69bdf0884ce6" containerName="s3-init" Apr 16 18:32:08.017672 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.015635 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8819646c-6541-4ede-a666-69bdf0884ce6" containerName="s3-init" Apr 16 18:32:08.017672 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.015690 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8819646c-6541-4ede-a666-69bdf0884ce6" containerName="s3-init" Apr 16 18:32:08.018536 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.018517 2562 util.go:30] "No sandbox for pod can be found. 
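The recurring util.go messages in this stretch ("No sandbox for pod can be found" on a fresh ADD, "No ready sandbox for pod can be found" once the s3-init pod's sandbox has exited after its container ran to completion) reflect the sync loop's sandbox check: reuse a ready sandbox if one exists, otherwise note that a new one would be needed. A toy version of that decision, with hypothetical types:

package main

import "fmt"

type sandbox struct {
	ID    string
	Ready bool
}

// pickSandbox returns a ready sandbox ID if any exists; for a completed
// pod like s3-init-z9f8t the message keeps recurring until deletion,
// since the kubelet will not actually restart a finished pod.
func pickSandbox(sandboxes []sandbox) (string, bool) {
	for _, s := range sandboxes {
		if s.Ready {
			return s.ID, true
		}
	}
	return "", false
}

func main() {
	if _, ok := pickSandbox(nil); !ok {
		fmt.Println("No sandbox for pod can be found. Need to start a new one")
	}
	if _, ok := pickSandbox([]sandbox{{ID: "15bf8c9d75e8f41c", Ready: false}}); !ok {
		fmt.Println("No ready sandbox for pod can be found. Need to start a new one")
	}
}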
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b" Apr 16 18:32:08.021418 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.021393 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-842jm\"" Apr 16 18:32:08.026107 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.026085 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b"] Apr 16 18:32:08.029325 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.029309 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b" Apr 16 18:32:08.157477 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.157450 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b"] Apr 16 18:32:08.159852 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:32:08.159819 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfb85259_842c_4393_bf90_574950f82c62.slice/crio-43cfaae34f7a81f64d77ac44013f7ea0a353cfbd332f316dcbb16de8162fa960 WatchSource:0}: Error finding container 43cfaae34f7a81f64d77ac44013f7ea0a353cfbd332f316dcbb16de8162fa960: Status 404 returned error can't find the container with id 43cfaae34f7a81f64d77ac44013f7ea0a353cfbd332f316dcbb16de8162fa960 Apr 16 18:32:08.161621 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.161606 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:32:08.528515 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:08.528477 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b" event={"ID":"cfb85259-842c-4393-bf90-574950f82c62","Type":"ContainerStarted","Data":"43cfaae34f7a81f64d77ac44013f7ea0a353cfbd332f316dcbb16de8162fa960"} Apr 16 18:32:09.533159 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:09.533069 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b" event={"ID":"cfb85259-842c-4393-bf90-574950f82c62","Type":"ContainerStarted","Data":"4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82"} Apr 16 18:32:09.533631 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:09.533220 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b" Apr 16 18:32:09.535180 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:09.535164 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b" Apr 16 18:32:09.548243 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:32:09.548206 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b" podStartSLOduration=0.429379765 podStartE2EDuration="1.548175399s" podCreationTimestamp="2026-04-16 18:32:08 +0000 UTC" firstStartedPulling="2026-04-16 18:32:08.161734246 +0000 UTC m=+939.247183616" lastFinishedPulling="2026-04-16 18:32:09.280529866 +0000 UTC m=+940.365979250" observedRunningTime="2026-04-16 18:32:09.547764646 +0000 UTC m=+940.633214038" watchObservedRunningTime="2026-04-16 18:32:09.548175399 
Apr 16 18:33:43.092288 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.092253 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b_cfb85259-842c-4393-bf90-574950f82c62/kserve-container/0.log"
Apr 16 18:33:43.404560 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.404482 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b"]
Apr 16 18:33:43.404726 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.404704 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b" podUID="cfb85259-842c-4393-bf90-574950f82c62" containerName="kserve-container" containerID="cri-o://4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82" gracePeriod=30
Apr 16 18:33:43.645491 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.645468 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b"
Apr 16 18:33:43.807599 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.807564 2562 generic.go:358] "Generic (PLEG): container finished" podID="cfb85259-842c-4393-bf90-574950f82c62" containerID="4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82" exitCode=2
Apr 16 18:33:43.807789 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.807630 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b"
Apr 16 18:33:43.807789 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.807645 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b" event={"ID":"cfb85259-842c-4393-bf90-574950f82c62","Type":"ContainerDied","Data":"4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82"}
Apr 16 18:33:43.807789 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.807680 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b" event={"ID":"cfb85259-842c-4393-bf90-574950f82c62","Type":"ContainerDied","Data":"43cfaae34f7a81f64d77ac44013f7ea0a353cfbd332f316dcbb16de8162fa960"}
Apr 16 18:33:43.807789 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.807695 2562 scope.go:117] "RemoveContainer" containerID="4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82"
Apr 16 18:33:43.816060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.816033 2562 scope.go:117] "RemoveContainer" containerID="4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82"
Apr 16 18:33:43.816355 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:33:43.816335 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82\": container with ID starting with 4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82 not found: ID does not exist" containerID="4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82"
Apr 16 18:33:43.816411 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.816364 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82"} err="failed to get container status \"4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82\": rpc error: code = NotFound desc = could not find container \"4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82\": container with ID starting with 4aac8d33538546e993006a90b6791e86b6d8d955467bb9a51a2a04a972187b82 not found: ID does not exist"
Apr 16 18:33:43.830178 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.830149 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b"]
Apr 16 18:33:43.833736 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:43.833711 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-1141a-predictor-5446b64f7b-rvn2b"]
Apr 16 18:33:45.523914 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:33:45.523875 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb85259-842c-4393-bf90-574950f82c62" path="/var/lib/kubelet/pods/cfb85259-842c-4393-bf90-574950f82c62/volumes"
Apr 16 18:36:29.480984 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:36:29.480904 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log"
Apr 16 18:36:29.481961 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:36:29.481940 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log"
Apr 16 18:40:42.079914 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.079877 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rbf9c/must-gather-299rg"]
Apr 16 18:40:42.080685 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.080226 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfb85259-842c-4393-bf90-574950f82c62" containerName="kserve-container"
Apr 16 18:40:42.080685 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.080239 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb85259-842c-4393-bf90-574950f82c62" containerName="kserve-container"
Apr 16 18:40:42.080685 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.080305 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfb85259-842c-4393-bf90-574950f82c62" containerName="kserve-container"
Apr 16 18:40:42.083473 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.083457 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rbf9c/must-gather-299rg"
Apr 16 18:40:42.086099 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.086076 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rbf9c\"/\"kube-root-ca.crt\""
Apr 16 18:40:42.087143 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.087098 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rbf9c\"/\"openshift-service-ca.crt\""
Apr 16 18:40:42.087338 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.087144 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rbf9c\"/\"default-dockercfg-m6bxr\""
Apr 16 18:40:42.089322 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.089297 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rbf9c/must-gather-299rg"]
Apr 16 18:40:42.213737 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.213694 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wvfp\" (UniqueName: \"kubernetes.io/projected/70f4f070-2162-4d20-9647-262e8cc2a281-kube-api-access-7wvfp\") pod \"must-gather-299rg\" (UID: \"70f4f070-2162-4d20-9647-262e8cc2a281\") " pod="openshift-must-gather-rbf9c/must-gather-299rg"
Apr 16 18:40:42.213921 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.213765 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70f4f070-2162-4d20-9647-262e8cc2a281-must-gather-output\") pod \"must-gather-299rg\" (UID: \"70f4f070-2162-4d20-9647-262e8cc2a281\") " pod="openshift-must-gather-rbf9c/must-gather-299rg"
Apr 16 18:40:42.314319 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.314285 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wvfp\" (UniqueName: \"kubernetes.io/projected/70f4f070-2162-4d20-9647-262e8cc2a281-kube-api-access-7wvfp\") pod \"must-gather-299rg\" (UID: \"70f4f070-2162-4d20-9647-262e8cc2a281\") " pod="openshift-must-gather-rbf9c/must-gather-299rg"
Apr 16 18:40:42.314535 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.314377 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70f4f070-2162-4d20-9647-262e8cc2a281-must-gather-output\") pod \"must-gather-299rg\" (UID: \"70f4f070-2162-4d20-9647-262e8cc2a281\") " pod="openshift-must-gather-rbf9c/must-gather-299rg"
Apr 16 18:40:42.314724 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.314704 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70f4f070-2162-4d20-9647-262e8cc2a281-must-gather-output\") pod \"must-gather-299rg\" (UID: \"70f4f070-2162-4d20-9647-262e8cc2a281\") " pod="openshift-must-gather-rbf9c/must-gather-299rg"
Apr 16 18:40:42.323202 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.323157 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wvfp\" (UniqueName: \"kubernetes.io/projected/70f4f070-2162-4d20-9647-262e8cc2a281-kube-api-access-7wvfp\") pod \"must-gather-299rg\" (UID: \"70f4f070-2162-4d20-9647-262e8cc2a281\") " pod="openshift-must-gather-rbf9c/must-gather-299rg"
Apr 16 18:40:42.403003 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.402917 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rbf9c/must-gather-299rg"
Apr 16 18:40:42.526117 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.526077 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rbf9c/must-gather-299rg"]
Apr 16 18:40:42.528862 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:40:42.528836 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70f4f070_2162_4d20_9647_262e8cc2a281.slice/crio-823e7bfb0a017e7e0bcb892ba683e63a33865e6c40e515494cc9786ddc23acb1 WatchSource:0}: Error finding container 823e7bfb0a017e7e0bcb892ba683e63a33865e6c40e515494cc9786ddc23acb1: Status 404 returned error can't find the container with id 823e7bfb0a017e7e0bcb892ba683e63a33865e6c40e515494cc9786ddc23acb1
Apr 16 18:40:42.530495 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:42.530479 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:40:43.057274 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:43.057236 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbf9c/must-gather-299rg" event={"ID":"70f4f070-2162-4d20-9647-262e8cc2a281","Type":"ContainerStarted","Data":"823e7bfb0a017e7e0bcb892ba683e63a33865e6c40e515494cc9786ddc23acb1"}
Apr 16 18:40:47.072876 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:47.072828 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbf9c/must-gather-299rg" event={"ID":"70f4f070-2162-4d20-9647-262e8cc2a281","Type":"ContainerStarted","Data":"9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812"}
Apr 16 18:40:47.072876 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:47.072875 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbf9c/must-gather-299rg" event={"ID":"70f4f070-2162-4d20-9647-262e8cc2a281","Type":"ContainerStarted","Data":"7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4"}
Apr 16 18:40:47.096993 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:40:47.096928 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rbf9c/must-gather-299rg" podStartSLOduration=1.020683015 podStartE2EDuration="5.096907039s" podCreationTimestamp="2026-04-16 18:40:42 +0000 UTC" firstStartedPulling="2026-04-16 18:40:42.530601434 +0000 UTC m=+1453.616050802" lastFinishedPulling="2026-04-16 18:40:46.606825453 +0000 UTC m=+1457.692274826" observedRunningTime="2026-04-16 18:40:47.095438659 +0000 UTC m=+1458.180888055" watchObservedRunningTime="2026-04-16 18:40:47.096907039 +0000 UTC m=+1458.182356430"
Apr 16 18:41:05.136583 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:05.136545 2562 generic.go:358] "Generic (PLEG): container finished" podID="70f4f070-2162-4d20-9647-262e8cc2a281" containerID="7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4" exitCode=0
Apr 16 18:41:05.136583 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:05.136587 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbf9c/must-gather-299rg" event={"ID":"70f4f070-2162-4d20-9647-262e8cc2a281","Type":"ContainerDied","Data":"7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4"}
Apr 16 18:41:05.137036 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:05.136889 2562 scope.go:117] "RemoveContainer" containerID="7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4"
Apr 16 18:41:05.937032 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:05.937003 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbf9c_must-gather-299rg_70f4f070-2162-4d20-9647-262e8cc2a281/gather/0.log"
Apr 16 18:41:09.296407 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:09.296377 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-q5k8m_97a27848-fb3c-407c-8213-4e08944e760a/global-pull-secret-syncer/0.log"
Apr 16 18:41:09.465476 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:09.465443 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pbnvm_5f7e2ea0-df8c-485d-a95a-e622c53fab2d/konnectivity-agent/0.log"
Apr 16 18:41:09.490626 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:09.490575 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-68.ec2.internal_69d4f0efd48f93f6cc380a4943e78ab2/haproxy/0.log"
Apr 16 18:41:11.387769 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.387734 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rbf9c/must-gather-299rg"]
Apr 16 18:41:11.388182 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.387947 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-rbf9c/must-gather-299rg" podUID="70f4f070-2162-4d20-9647-262e8cc2a281" containerName="copy" containerID="cri-o://9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812" gracePeriod=2
Apr 16 18:41:11.392900 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.392865 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rbf9c/must-gather-299rg"]
Apr 16 18:41:11.612306 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.612278 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbf9c_must-gather-299rg_70f4f070-2162-4d20-9647-262e8cc2a281/copy/0.log"
Apr 16 18:41:11.612650 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.612633 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rbf9c/must-gather-299rg"
Apr 16 18:41:11.674376 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.674350 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70f4f070-2162-4d20-9647-262e8cc2a281-must-gather-output\") pod \"70f4f070-2162-4d20-9647-262e8cc2a281\" (UID: \"70f4f070-2162-4d20-9647-262e8cc2a281\") "
Apr 16 18:41:11.674530 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.674428 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wvfp\" (UniqueName: \"kubernetes.io/projected/70f4f070-2162-4d20-9647-262e8cc2a281-kube-api-access-7wvfp\") pod \"70f4f070-2162-4d20-9647-262e8cc2a281\" (UID: \"70f4f070-2162-4d20-9647-262e8cc2a281\") "
Apr 16 18:41:11.675669 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.675633 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f4f070-2162-4d20-9647-262e8cc2a281-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "70f4f070-2162-4d20-9647-262e8cc2a281" (UID: "70f4f070-2162-4d20-9647-262e8cc2a281"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:41:11.676631 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.676611 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f4f070-2162-4d20-9647-262e8cc2a281-kube-api-access-7wvfp" (OuterVolumeSpecName: "kube-api-access-7wvfp") pod "70f4f070-2162-4d20-9647-262e8cc2a281" (UID: "70f4f070-2162-4d20-9647-262e8cc2a281"). InnerVolumeSpecName "kube-api-access-7wvfp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:41:11.775817 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.775782 2562 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70f4f070-2162-4d20-9647-262e8cc2a281-must-gather-output\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:41:11.775817 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:11.775812 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7wvfp\" (UniqueName: \"kubernetes.io/projected/70f4f070-2162-4d20-9647-262e8cc2a281-kube-api-access-7wvfp\") on node \"ip-10-0-128-68.ec2.internal\" DevicePath \"\""
Apr 16 18:41:12.159537 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:12.159458 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbf9c_must-gather-299rg_70f4f070-2162-4d20-9647-262e8cc2a281/copy/0.log"
Apr 16 18:41:12.159805 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:12.159783 2562 generic.go:358] "Generic (PLEG): container finished" podID="70f4f070-2162-4d20-9647-262e8cc2a281" containerID="9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812" exitCode=143
Apr 16 18:41:12.159894 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:12.159842 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rbf9c/must-gather-299rg"
Apr 16 18:41:12.159971 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:12.159847 2562 scope.go:117] "RemoveContainer" containerID="9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812"
Apr 16 18:41:12.167755 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:12.167739 2562 scope.go:117] "RemoveContainer" containerID="7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4"
Apr 16 18:41:12.179667 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:12.179641 2562 scope.go:117] "RemoveContainer" containerID="9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812"
Apr 16 18:41:12.179957 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:41:12.179938 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812\": container with ID starting with 9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812 not found: ID does not exist" containerID="9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812"
Apr 16 18:41:12.180016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:12.179970 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812"} err="failed to get container status \"9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812\": rpc error: code = NotFound desc = could not find container \"9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812\": container with ID starting with 9433f3abeccef95147434392583dd98cfd8d4560df76d0e099d37d0ab947b812 not found: ID does not exist"
Apr 16 18:41:12.180016 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:12.179989 2562 scope.go:117] "RemoveContainer" containerID="7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4"
Apr 16 18:41:12.180648 ip-10-0-128-68 kubenswrapper[2562]: E0416 18:41:12.180623 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4\": container with ID starting with 7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4 not found: ID does not exist" containerID="7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4"
Apr 16 18:41:12.180730 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:12.180660 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4"} err="failed to get container status \"7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4\": rpc error: code = NotFound desc = could not find container \"7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4\": container with ID starting with 7ccc9d6ad02877e882215debb455d7b57a11aa4480995f7ba05b39f76e92a1c4 not found: ID does not exist"
Apr 16 18:41:12.999821 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:12.999781 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a76b197-3ec2-40d0-a268-c2dde55da620/alertmanager/0.log"
Apr 16 18:41:13.026571 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.026486 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a76b197-3ec2-40d0-a268-c2dde55da620/config-reloader/0.log"
Apr 16 18:41:13.051038 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.051011 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a76b197-3ec2-40d0-a268-c2dde55da620/kube-rbac-proxy-web/0.log"
Apr 16 18:41:13.071337 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.071307 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a76b197-3ec2-40d0-a268-c2dde55da620/kube-rbac-proxy/0.log"
Apr 16 18:41:13.094521 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.094494 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a76b197-3ec2-40d0-a268-c2dde55da620/kube-rbac-proxy-metric/0.log"
Apr 16 18:41:13.116317 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.116289 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a76b197-3ec2-40d0-a268-c2dde55da620/prom-label-proxy/0.log"
Apr 16 18:41:13.140897 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.140871 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a76b197-3ec2-40d0-a268-c2dde55da620/init-config-reloader/0.log"
Apr 16 18:41:13.208026 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.207988 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-5h6ps_76362cf3-2b44-437b-bd99-b7048c4e3aa6/kube-state-metrics/0.log"
Apr 16 18:41:13.232741 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.232708 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-5h6ps_76362cf3-2b44-437b-bd99-b7048c4e3aa6/kube-rbac-proxy-main/0.log"
Apr 16 18:41:13.258174 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.258131 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-5h6ps_76362cf3-2b44-437b-bd99-b7048c4e3aa6/kube-rbac-proxy-self/0.log"
Apr 16 18:41:13.282836 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.282760 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-55c9f64cb6-5j7rb_d04312b8-1687-41a0-a148-9677c142ead3/metrics-server/0.log"
Apr 16 18:41:13.308713 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.308680 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-wqvh7_8c889714-3a36-4fdd-b6ba-51298672e02f/monitoring-plugin/0.log"
Apr 16 18:41:13.405625 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.405595 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b5w4l_55debf3d-042d-4ec7-811c-39c4ba0d540d/node-exporter/0.log"
Apr 16 18:41:13.426113 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.426082 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b5w4l_55debf3d-042d-4ec7-811c-39c4ba0d540d/kube-rbac-proxy/0.log"
Apr 16 18:41:13.445570 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.445544 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b5w4l_55debf3d-042d-4ec7-811c-39c4ba0d540d/init-textfile/0.log"
Apr 16 18:41:13.525221 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.524079 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f4f070-2162-4d20-9647-262e8cc2a281" path="/var/lib/kubelet/pods/70f4f070-2162-4d20-9647-262e8cc2a281/volumes"
Apr 16 18:41:13.640914 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.640878 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aedd43ae-ab04-46bb-a249-ad6749c68f29/prometheus/0.log"
Apr 16 18:41:13.660880 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.660844 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aedd43ae-ab04-46bb-a249-ad6749c68f29/config-reloader/0.log"
Apr 16 18:41:13.681455 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.681415 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aedd43ae-ab04-46bb-a249-ad6749c68f29/thanos-sidecar/0.log"
Apr 16 18:41:13.703137 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.703087 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aedd43ae-ab04-46bb-a249-ad6749c68f29/kube-rbac-proxy-web/0.log"
Apr 16 18:41:13.723672 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.723644 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aedd43ae-ab04-46bb-a249-ad6749c68f29/kube-rbac-proxy/0.log"
Apr 16 18:41:13.744173 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.744141 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aedd43ae-ab04-46bb-a249-ad6749c68f29/kube-rbac-proxy-thanos/0.log"
Apr 16 18:41:13.768163 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.768140 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aedd43ae-ab04-46bb-a249-ad6749c68f29/init-config-reloader/0.log"
Apr 16 18:41:13.872881 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.872807 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-cb7556664-bmkxb_322a4c8c-9e9e-4c9c-9edc-04fb1081dc99/telemeter-client/0.log"
Apr 16 18:41:13.898874 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.898848 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-cb7556664-bmkxb_322a4c8c-9e9e-4c9c-9edc-04fb1081dc99/reload/0.log"
Apr 16 18:41:13.927599 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:13.927574 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-cb7556664-bmkxb_322a4c8c-9e9e-4c9c-9edc-04fb1081dc99/kube-rbac-proxy/0.log"
Apr 16 18:41:16.424166 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.424137 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-9q4ds_9292231d-6f76-4890-98d2-105390472ac1/volume-data-source-validator/0.log"
Apr 16 18:41:16.429441 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.429417 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"]
Apr 16 18:41:16.429783 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.429770 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70f4f070-2162-4d20-9647-262e8cc2a281" containerName="copy"
Apr 16 18:41:16.429830 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.429785 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f4f070-2162-4d20-9647-262e8cc2a281" containerName="copy"
Apr 16 18:41:16.429830 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.429795 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70f4f070-2162-4d20-9647-262e8cc2a281" containerName="gather"
Apr 16 18:41:16.429830 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.429801 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f4f070-2162-4d20-9647-262e8cc2a281" containerName="gather"
Apr 16 18:41:16.429918 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.429848 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="70f4f070-2162-4d20-9647-262e8cc2a281" containerName="gather"
Apr 16 18:41:16.429918 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.429858 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="70f4f070-2162-4d20-9647-262e8cc2a281" containerName="copy"
Apr 16 18:41:16.433419 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.433400 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.436099 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.436070 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-chptg\"/\"default-dockercfg-cnqsp\""
Apr 16 18:41:16.436099 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.436087 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-chptg\"/\"openshift-service-ca.crt\""
Apr 16 18:41:16.437060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.437041 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-chptg\"/\"kube-root-ca.crt\""
Apr 16 18:41:16.441899 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.441878 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"]
Apr 16 18:41:16.515740 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.515699 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-sys\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.515937 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.515755 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzqnr\" (UniqueName: \"kubernetes.io/projected/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-kube-api-access-dzqnr\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.515937 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.515780 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-podres\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.515937 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.515814 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-lib-modules\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.515937 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.515843 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-proc\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.616736 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.616701 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-proc\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.616940 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.616763 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-sys\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.616940 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.616836 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzqnr\" (UniqueName: \"kubernetes.io/projected/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-kube-api-access-dzqnr\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.616940 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.616851 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-proc\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.616940 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.616870 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-sys\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.616940 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.616872 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-podres\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.617142 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.616950 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-lib-modules\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.617142 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.616987 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-podres\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.617142 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.617101 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-lib-modules\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.625415 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.625392 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzqnr\" (UniqueName: \"kubernetes.io/projected/2a211191-3589-4dcf-a4b6-b99fb9cf2a85-kube-api-access-dzqnr\") pod \"perf-node-gather-daemonset-ksjnt\" (UID: \"2a211191-3589-4dcf-a4b6-b99fb9cf2a85\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.744799 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.744697 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:16.869093 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:16.869068 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"]
Apr 16 18:41:16.871855 ip-10-0-128-68 kubenswrapper[2562]: W0416 18:41:16.871826 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2a211191_3589_4dcf_a4b6_b99fb9cf2a85.slice/crio-d43c2282f969738cec3a2da117cba5ba3dc12dd5e0d8efd5004960e03bca8936 WatchSource:0}: Error finding container d43c2282f969738cec3a2da117cba5ba3dc12dd5e0d8efd5004960e03bca8936: Status 404 returned error can't find the container with id d43c2282f969738cec3a2da117cba5ba3dc12dd5e0d8efd5004960e03bca8936
Apr 16 18:41:17.125799 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:17.125711 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bz65j_e33c04f3-e414-4174-8047-0f84ece6cd5d/dns/0.log"
Apr 16 18:41:17.177652 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:17.177618 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt" event={"ID":"2a211191-3589-4dcf-a4b6-b99fb9cf2a85","Type":"ContainerStarted","Data":"02f9f6626ed6377fec7978b69eae5bcc1dcbc36ca40abb5caa570d286db6f90d"}
Apr 16 18:41:17.177652 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:17.177654 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt" event={"ID":"2a211191-3589-4dcf-a4b6-b99fb9cf2a85","Type":"ContainerStarted","Data":"d43c2282f969738cec3a2da117cba5ba3dc12dd5e0d8efd5004960e03bca8936"}
Apr 16 18:41:17.177858 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:17.177742 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:17.180182 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:17.180158 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bz65j_e33c04f3-e414-4174-8047-0f84ece6cd5d/kube-rbac-proxy/0.log"
Apr 16 18:41:17.198372 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:17.198328 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt" podStartSLOduration=1.198311756 podStartE2EDuration="1.198311756s" podCreationTimestamp="2026-04-16 18:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:17.197681235 +0000 UTC m=+1488.283130626" watchObservedRunningTime="2026-04-16 18:41:17.198311756 +0000 UTC m=+1488.283761148"
Apr 16 18:41:17.319285 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:17.319256 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8spcs_1ac43e44-6bc5-4678-9022-029aed19a8c1/dns-node-resolver/0.log"
Apr 16 18:41:17.779211 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:17.779155 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6994467cc7-df69f_e2f036a5-fee5-49c7-9a46-fd0ccba53895/registry/0.log"
Apr 16 18:41:17.799168 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:17.799142 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5m9b6_d4f7966d-78bf-4cbb-a764-b066fe69e484/node-ca/0.log"
Apr 16 18:41:18.878742 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:18.878710 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-646jh_cea240e4-3f15-45d5-a754-105ae5e43a47/serve-healthcheck-canary/0.log"
Apr 16 18:41:19.371558 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:19.371522 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dn8ld_7b335442-4810-4f3b-a541-31865a746c8b/kube-rbac-proxy/0.log"
Apr 16 18:41:19.390397 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:19.390361 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dn8ld_7b335442-4810-4f3b-a541-31865a746c8b/exporter/0.log"
Apr 16 18:41:19.410884 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:19.410856 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dn8ld_7b335442-4810-4f3b-a541-31865a746c8b/extractor/0.log"
Apr 16 18:41:21.552408 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:21.552376 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-z9f8t_8819646c-6541-4ede-a666-69bdf0884ce6/s3-init/0.log"
Apr 16 18:41:23.191261 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:23.191232 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-ksjnt"
Apr 16 18:41:25.324702 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:25.324666 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-6wlxw_075eb149-d2bc-4930-aa0f-185e8cc92d22/migrator/0.log"
Apr 16 18:41:25.350393 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:25.350364 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-6wlxw_075eb149-d2bc-4930-aa0f-185e8cc92d22/graceful-termination/0.log"
Apr 16 18:41:25.705036 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:25.705003 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-dwl9r_ac44aa61-86ed-41f4-aab0-bbabab9224b1/kube-storage-version-migrator-operator/1.log"
Apr 16 18:41:25.706707 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:25.706680 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-dwl9r_ac44aa61-86ed-41f4-aab0-bbabab9224b1/kube-storage-version-migrator-operator/0.log"
Apr 16 18:41:26.745010 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:26.744976 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87852_0f27d004-e5fc-4560-8699-ce203c2bf77e/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:41:26.769555 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:26.769516 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87852_0f27d004-e5fc-4560-8699-ce203c2bf77e/egress-router-binary-copy/0.log"
Apr 16 18:41:26.790763 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:26.790734 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87852_0f27d004-e5fc-4560-8699-ce203c2bf77e/cni-plugins/0.log"
Apr 16 18:41:26.811723 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:26.811698 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87852_0f27d004-e5fc-4560-8699-ce203c2bf77e/bond-cni-plugin/0.log"
Apr 16 18:41:26.841551 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:26.841526 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87852_0f27d004-e5fc-4560-8699-ce203c2bf77e/routeoverride-cni/0.log"
Apr 16 18:41:26.863620 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:26.863592 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87852_0f27d004-e5fc-4560-8699-ce203c2bf77e/whereabouts-cni-bincopy/0.log"
Apr 16 18:41:26.884955 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:26.884912 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87852_0f27d004-e5fc-4560-8699-ce203c2bf77e/whereabouts-cni/0.log"
Apr 16 18:41:27.270422 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:27.270389 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rbhdn_d37887a7-2697-430c-834c-76614ddbb9b9/kube-multus/0.log"
Apr 16 18:41:27.291999 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:27.291962 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4vgjf_7682219f-20c8-40ee-a84d-c68d79df1dd8/network-metrics-daemon/0.log"
Apr 16 18:41:27.315967 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:27.315934 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4vgjf_7682219f-20c8-40ee-a84d-c68d79df1dd8/kube-rbac-proxy/0.log"
Apr 16 18:41:28.749048 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:28.749014 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-controller/0.log"
Apr 16 18:41:28.767820 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:28.767785 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log"
Apr 16 18:41:28.783431 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:28.783402 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/1.log"
Apr 16 18:41:28.806965 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:28.806941 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/kube-rbac-proxy-node/0.log"
Apr 16 18:41:28.833772 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:28.833741 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 18:41:28.851060 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:28.851028 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/northd/0.log"
Apr 16 18:41:28.874144 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:28.874110 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/nbdb/0.log"
Apr 16 18:41:28.903006 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:28.902970 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/sbdb/0.log"
Apr 16 18:41:29.019803 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:29.019727 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovnkube-controller/0.log"
Apr 16 18:41:29.502645 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:29.502621 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log"
Apr 16 18:41:29.504706 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:29.504683 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tttq4_7b023a24-7ef2-470d-8cd5-90366b171323/ovn-acl-logging/0.log"
Apr 16 18:41:30.115240 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:30.115207 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-7r5l5_167f8a10-92f4-444e-912f-415dafc03e58/network-check-target-container/0.log"
Apr 16 18:41:31.175386 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:31.175357 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tl27p_2e16d78d-6a61-4210-b4d5-ecf12d2038ca/iptables-alerter/0.log"
Apr 16 18:41:31.845980 ip-10-0-128-68 kubenswrapper[2562]: I0416 18:41:31.845949 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-zww8n_64568791-94bd-49ea-adf9-c39f5c4c8f08/tuned/0.log"