Apr 17 11:16:09.961672 ip-10-0-142-114 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:16:10.399049 ip-10-0-142-114 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:10.399049 ip-10-0-142-114 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:16:10.399049 ip-10-0-142-114 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:10.399049 ip-10-0-142-114 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:16:10.399049 ip-10-0-142-114 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
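Every deprecation warning above points at the same remedy: move the flag into the file named by --config. As a rough sketch (field names from the upstream KubeletConfiguration v1beta1 API; the values are illustrative, not taken from this node), the flagged options could be expressed in the config file like this:

```yaml
# Illustrative KubeletConfiguration sketch, NOT this node's actual config.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
```

--pod-infra-container-image has no config-file replacement; per the log, the sandbox image is expected to come from the CRI runtime instead.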
Apr 17 11:16:10.400166 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.399910 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:16:10.404690 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404674 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:10.404690 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404689 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404693 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404697 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404700 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404704 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404706 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404709 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404712 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404715 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404718 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404720 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404723 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404725 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404728 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404731 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404734 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404737 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404740 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404743 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404746 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:10.404752 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404748 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404751 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404754 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404757 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404760 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404763 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404766 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404769 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404771 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404774 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404776 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404779 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404782 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404785 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404788 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404790 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404793 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404795 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404798 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404801 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:10.405230 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404803 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404806 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404809 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404811 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404814 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404817 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404821 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404825 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404828 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404830 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404833 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404835 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404838 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404840 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404844 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404846 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404849 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404852 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404855 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:10.405776 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404857 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404860 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404862 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404865 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404868 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404871 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404873 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404876 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404878 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404880 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404883 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404885 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404889 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404891 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404896 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404899 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404903 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404906 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404909 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:10.406246 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404912 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404915 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404918 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404921 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404924 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404927 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.404929 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405299 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405304 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405307 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405311 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405314 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405317 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405320 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405323 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405326 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405328 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405332 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405334 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405337 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:10.406799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405340 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405342 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405344 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405347 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405349 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405352 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405354 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405357 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405359 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405362 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405365 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405368 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405371 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405373 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405376 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405378 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405397 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405400 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405403 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405405 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:10.407295 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405409 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405411 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405414 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405417 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405419 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405422 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405425 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405428 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405432 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405435 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405439 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405442 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405445 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405447 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405450 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405453 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405455 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405458 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405461 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405464 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:10.407810 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405466 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405468 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405471 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405473 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405476 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405478 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405493 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405497 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405500 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405503 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405506 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405510 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405514 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405518 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405521 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405524 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405526 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405529 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405532 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405535 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:10.408335 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405538 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405540 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405543 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405546 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405549 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405551 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405554 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405556 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405559 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405561 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405564 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405567 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.405569 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405635 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405642 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405649 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405654 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405658 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405662 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405667 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:16:10.408832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405671 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405674 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405677 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405680 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405684 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405687 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405690 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405693 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405696 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405699 2568 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405702 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405705 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405709 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405712 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405715 2568 flags.go:64] FLAG: --config-dir=""
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405718 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405728 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405732 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405735 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405738 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405742 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405748 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405751 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405754 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405757 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:16:10.409315 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405760 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405764 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405767 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405770 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405773 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405776 2568 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405779 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405784 2568 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405787 2568 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405790 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405794 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405797 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405801 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405804 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405807 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405810 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405813 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405816 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405819 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405822 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405824 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405827 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405830 2568 flags.go:64] FLAG: --feature-gates=""
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405833 2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405837 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 11:16:10.409932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405840 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405844 2568
flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405846 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405851 2568 flags.go:64] FLAG: --help="false" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405854 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-142-114.ec2.internal" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405857 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405860 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405863 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405866 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405870 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405872 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405875 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405878 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405881 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405884 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 
11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405887 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405890 2568 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405894 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405897 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405900 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405903 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405906 2568 flags.go:64] FLAG: --lock-file="" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405909 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405912 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 11:16:10.410553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405915 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405920 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405923 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405926 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405929 2568 flags.go:64] FLAG: --logging-format="text" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405932 2568 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405935 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405938 2568 flags.go:64] FLAG: --manifest-url="" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405941 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405946 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405949 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405954 2568 flags.go:64] FLAG: --max-pods="110" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405957 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405961 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405964 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405967 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405970 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405973 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405975 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405983 2568 flags.go:64] FLAG: 
--node-status-max-images="50" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405985 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405995 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.405999 2568 flags.go:64] FLAG: --pod-cidr="" Apr 17 11:16:10.411115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406002 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406008 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406013 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406016 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406019 2568 flags.go:64] FLAG: --port="10250" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406022 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406024 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06b49113ac049944e" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406027 2568 flags.go:64] FLAG: --qos-reserved="" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406030 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406033 2568 flags.go:64] FLAG: --register-node="true" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406036 2568 flags.go:64] FLAG: --register-schedulable="true" 
Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406039 2568 flags.go:64] FLAG: --register-with-taints="" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406043 2568 flags.go:64] FLAG: --registry-burst="10" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406045 2568 flags.go:64] FLAG: --registry-qps="5" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406048 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406051 2568 flags.go:64] FLAG: --reserved-memory="" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406060 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406063 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406066 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406069 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406072 2568 flags.go:64] FLAG: --runonce="false" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406077 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406080 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406083 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406085 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406088 2568 flags.go:64] FLAG: 
--storage-driver-buffer-duration="1m0s" Apr 17 11:16:10.411697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406091 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406094 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406097 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406100 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406103 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406106 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406109 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406112 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406117 2568 flags.go:64] FLAG: --system-cgroups="" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406120 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406126 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406129 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406131 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406135 2568 flags.go:64] FLAG: --tls-min-version="" Apr 17 11:16:10.412301 
ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406138 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406140 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406143 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406146 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406149 2568 flags.go:64] FLAG: --v="2" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406153 2568 flags.go:64] FLAG: --version="false" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406157 2568 flags.go:64] FLAG: --vmodule="" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406161 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.406164 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406258 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:10.412301 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406262 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406272 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406275 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406280 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:10.412919 
ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406283 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406286 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406288 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406291 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406293 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406296 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406299 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406301 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406304 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406307 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406309 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406312 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406315 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 
11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406318 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406320 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406323 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:10.412919 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406325 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406330 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406333 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406335 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406338 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406341 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406344 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406346 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406348 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:10.413506 
ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406351 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406354 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406358 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406361 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406364 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406372 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406376 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406391 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406394 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406397 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:10.413506 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406416 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406421 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406424 2568 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406427 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406430 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406432 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406435 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406438 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406440 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406447 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406449 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406452 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406455 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406457 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406460 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 
11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406462 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406465 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406468 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406470 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406473 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:10.413984 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406476 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406478 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406481 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406483 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406486 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406488 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406491 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406493 2568 
feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406498 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406501 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406503 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406506 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406509 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406512 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406514 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406516 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406519 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406522 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406524 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:10.414513 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406526 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager 
Apr 17 11:16:10.414973 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406529 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:10.414973 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406533 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:10.414973 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406535 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:10.414973 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406538 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:10.414973 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406541 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:10.414973 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.406543 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:10.414973 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.407399 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:10.417936 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.417917 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 11:16:10.417977 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.417936 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 11:16:10.418013 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418005 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:10.418013 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418010 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:10.418013 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418014 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418018 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418021 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418024 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418028 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418033 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418036 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418039 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418042 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418051 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418054 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418056 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418059 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418061 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418064 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418067 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418069 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418072 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418075 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:10.418093 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418077 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418080 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418082 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418085 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418087 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418091 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418093 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418097 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418100 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418103 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418106 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418108 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418111 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418113 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418116 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418119 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418121 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418124 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418126 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:10.418577 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418129 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418131 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418134 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418136 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418144 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418147 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418150 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418152 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418155 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418157 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418159 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418162 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418165 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418167 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418170 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418173 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418175 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418178 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418181 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418184 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:10.419079 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418186 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418189 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418191 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418195 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418198 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418201 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418204 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418206 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418209 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418211 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418214 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418216 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418219 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418221 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418224 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418226 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418228 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418237 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418240 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418242 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:10.419663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418245 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418248 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418250 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418253 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418256 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418258 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.418264 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418411 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418416 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418419 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418423 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418425 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418428 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418431 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418433 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418436 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:10.420254 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418438 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418441 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418444 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418446 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418449 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418451 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418454 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418457 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418461 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418464 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418466 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418469 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418471 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418474 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418482 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418485 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418487 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418497 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418501 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:10.420689 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418504 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418507 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418510 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418513 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418517 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418520 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418523 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418526 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418528 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418531 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418533 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418536 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418538 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418541 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418543 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418546 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418548 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418551 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418554 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418556 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:10.421149 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418559 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418561 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418564 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418566 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418569 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418571 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418574 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418576 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418592 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418595 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418597 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418600 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418602 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418605 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418608 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418610 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418613 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418616 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418618 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418621 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:10.421721 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418623 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418626 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418628 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418631 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418634 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418637 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418639 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418642 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418644 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418647 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418649 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418652 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418654 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418657 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418659 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418662 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418664 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:10.422224 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:10.418667 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:10.422648 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.418672 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:10.422648 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.420446 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:16:10.423450 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.423437 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:16:10.424438 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.424426 2568 server.go:1019] "Starting client certificate rotation"
Apr 17 11:16:10.424532 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.424519 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:10.424566 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.424555 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:10.452142 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.452121 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:10.455065 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.455048 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:10.470771 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.470750 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:16:10.476098 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.476082 2568 log.go:25] "Validated CRI v1 image API"
Apr 17 11:16:10.478014 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.477999 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:16:10.481837 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.481816 2568 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a452a24a-c6cb-4e4b-a992-5963c6fc9fac:/dev/nvme0n1p3 da103994-3602-4b5b-929d-7ead241013ac:/dev/nvme0n1p4]
Apr 17 11:16:10.481909 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.481836 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:16:10.484620 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.484600 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:10.487647 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.487529 2568 manager.go:217] Machine: {Timestamp:2026-04-17 11:16:10.485500026 +0000 UTC m=+0.408354834 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100266 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23826d7ae075cda56dc3a422f83aaa SystemUUID:ec23826d-7ae0-75cd-a56d-c3a422f83aaa BootID:cebecf73-3e8a-4add-89b8-0c3a2d718bb2 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9c:88:a0:f0:0f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9c:88:a0:f0:0f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:bc:ac:8c:a3:67 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:16:10.487647 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.487640 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:16:10.487804 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.487786 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 11:16:10.488229 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.488094 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 11:16:10.488400 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.488230 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-114.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 11:16:10.488476 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.488416 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 11:16:10.488476 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.488428 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 11:16:10.488476 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.488448 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:16:10.489269 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.489257 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:16:10.491066 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.491052 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:16:10.491293 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.491268 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 11:16:10.495198 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.495182 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 11:16:10.495198 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.495200 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 11:16:10.495291 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.495212 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 11:16:10.495291 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.495223 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 17 11:16:10.495291 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.495231 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 11:16:10.496311 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.496300 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:16:10.496348 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.496318 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:16:10.500639 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.500624 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 11:16:10.501889 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.501876 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 11:16:10.504660 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504646 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 11:16:10.504660 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504663 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 11:16:10.504800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504670 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 11:16:10.504800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504675 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 11:16:10.504800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504680 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 11:16:10.504800 ip-10-0-142-114 kubenswrapper[2568]:
I0417 11:16:10.504686 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 11:16:10.504800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504697 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 11:16:10.504800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504702 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 11:16:10.504800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504710 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 11:16:10.504800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504716 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 11:16:10.504800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504724 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 11:16:10.504800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.504733 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 11:16:10.505915 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.505905 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 11:16:10.505915 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.505916 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 11:16:10.506430 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.506407 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-114.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 11:16:10.506490 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.506421 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot 
list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 11:16:10.509352 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.509340 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 11:16:10.509399 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.509373 2568 server.go:1295] "Started kubelet" Apr 17 11:16:10.509466 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.509444 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 11:16:10.509512 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.509470 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 11:16:10.509558 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.509533 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 11:16:10.510184 ip-10-0-142-114 systemd[1]: Started Kubernetes Kubelet. Apr 17 11:16:10.510842 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.510826 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 11:16:10.511951 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.511934 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 17 11:16:10.515992 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.515974 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:10.516659 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.516639 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 11:16:10.517495 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.517475 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 11:16:10.517614 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.517603 2568 volume_manager.go:295] "The desired_state_of_world populator 
starts" Apr 17 11:16:10.517685 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.517678 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 11:16:10.517888 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.517878 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 17 11:16:10.517969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.517961 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 17 11:16:10.518039 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.517881 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found" Apr 17 11:16:10.519331 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.519311 2568 factory.go:55] Registering systemd factory Apr 17 11:16:10.519584 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.519372 2568 factory.go:223] Registration of the systemd container factory successfully Apr 17 11:16:10.519701 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.519686 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-114.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 11:16:10.519750 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.519693 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-114.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 11:16:10.519792 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.519748 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 11:16:10.520419 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.520399 2568 factory.go:153] Registering CRI-O factory Apr 17 11:16:10.520419 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.520418 2568 factory.go:223] Registration of the crio container factory successfully Apr 17 11:16:10.520697 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.519745 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-114.ec2.internal.18a720bc277c5c61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-114.ec2.internal,UID:ip-10-0-142-114.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-114.ec2.internal,},FirstTimestamp:2026-04-17 11:16:10.509352033 +0000 UTC m=+0.432206841,LastTimestamp:2026-04-17 11:16:10.509352033 +0000 UTC m=+0.432206841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-114.ec2.internal,}" Apr 17 11:16:10.521241 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.521224 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 11:16:10.521321 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.521260 2568 factory.go:103] Registering Raw factory Apr 17 11:16:10.521321 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.521277 2568 manager.go:1196] Started watching for new ooms in manager Apr 17 11:16:10.521701 ip-10-0-142-114 
kubenswrapper[2568]: I0417 11:16:10.521685 2568 manager.go:319] Starting recovery of all containers Apr 17 11:16:10.521786 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.521746 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 11:16:10.529254 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.529217 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 11:16:10.532562 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.532544 2568 manager.go:324] Recovery completed Apr 17 11:16:10.536611 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.536598 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:10.538774 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.538758 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:10.538825 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.538784 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:10.538825 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.538796 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:10.539217 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.539202 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 11:16:10.539263 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.539216 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 11:16:10.539263 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.539231 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:10.541149 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.541089 2568 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-114.ec2.internal.18a720bc293d4850 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-114.ec2.internal,UID:ip-10-0-142-114.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-114.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-114.ec2.internal,},FirstTimestamp:2026-04-17 11:16:10.53877256 +0000 UTC m=+0.461627369,LastTimestamp:2026-04-17 11:16:10.53877256 +0000 UTC m=+0.461627369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-114.ec2.internal,}" Apr 17 11:16:10.541812 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.541799 2568 policy_none.go:49] "None policy: Start" Apr 17 11:16:10.541849 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.541825 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 11:16:10.541849 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.541835 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 17 11:16:10.552152 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.552087 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-114.ec2.internal.18a720bc293d88e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-114.ec2.internal,UID:ip-10-0-142-114.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-142-114.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-142-114.ec2.internal,},FirstTimestamp:2026-04-17 11:16:10.538789093 +0000 UTC m=+0.461643902,LastTimestamp:2026-04-17 11:16:10.538789093 +0000 UTC m=+0.461643902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-114.ec2.internal,}" Apr 17 11:16:10.555685 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.555671 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wm8s2" Apr 17 11:16:10.562520 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.562502 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wm8s2" Apr 17 11:16:10.563959 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.563878 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-114.ec2.internal.18a720bc293db3f9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-114.ec2.internal,UID:ip-10-0-142-114.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-142-114.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-142-114.ec2.internal,},FirstTimestamp:2026-04-17 11:16:10.538800121 +0000 UTC m=+0.461654930,LastTimestamp:2026-04-17 11:16:10.538800121 +0000 UTC 
m=+0.461654930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-114.ec2.internal,}" Apr 17 11:16:10.577897 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.577879 2568 manager.go:341] "Starting Device Plugin manager" Apr 17 11:16:10.577973 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.577915 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 11:16:10.577973 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.577924 2568 server.go:85] "Starting device plugin registration server" Apr 17 11:16:10.578162 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.578150 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 11:16:10.578223 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.578163 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 11:16:10.578268 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.578257 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 11:16:10.578531 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.578323 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 11:16:10.578531 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.578339 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 11:16:10.578928 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.578912 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 11:16:10.578988 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.578951 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-114.ec2.internal\" not found" Apr 17 11:16:10.678750 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.678694 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:10.679977 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.679528 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:10.679977 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.679553 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:10.679977 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.679564 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:10.679977 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.679590 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-114.ec2.internal" Apr 17 11:16:10.685761 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.685745 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-114.ec2.internal" Apr 17 11:16:10.685838 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.685767 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-114.ec2.internal\": node \"ip-10-0-142-114.ec2.internal\" not found" Apr 17 11:16:10.698893 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.698873 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found" Apr 17 11:16:10.714783 ip-10-0-142-114 
kubenswrapper[2568]: I0417 11:16:10.714766 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 11:16:10.714865 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.714794 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 11:16:10.714865 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.714812 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 11:16:10.714865 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.714818 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 11:16:10.714865 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.714845 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 11:16:10.717171 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.717153 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:10.799468 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.799434 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found" Apr 17 11:16:10.815606 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.815569 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal"] Apr 17 11:16:10.815668 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.815662 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:10.817678 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.817655 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:10.817678 ip-10-0-142-114 kubenswrapper[2568]: I0417 
11:16:10.817682 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:10.817825 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.817692 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:10.818811 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.818796 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:10.818896 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.818821 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc8e7800d795086fdca1d6407793ef9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal\" (UID: \"bfc8e7800d795086fdca1d6407793ef9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal" Apr 17 11:16:10.818896 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.818855 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc8e7800d795086fdca1d6407793ef9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal\" (UID: \"bfc8e7800d795086fdca1d6407793ef9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal" Apr 17 11:16:10.818994 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.818947 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal" Apr 17 11:16:10.818994 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.818974 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:10.819603 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.819572 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:10.819603 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.819581 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:10.819603 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.819600 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:10.819746 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.819612 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:10.819746 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.819617 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:10.819746 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.819628 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:10.820890 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.820877 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal" Apr 17 11:16:10.820930 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.820904 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:10.822275 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.822260 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:10.822370 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.822284 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:10.822370 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.822320 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:10.848933 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.848916 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-114.ec2.internal\" not found" node="ip-10-0-142-114.ec2.internal" Apr 17 11:16:10.853131 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.853115 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-114.ec2.internal\" not found" node="ip-10-0-142-114.ec2.internal" Apr 17 11:16:10.899717 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:10.899687 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found" Apr 17 11:16:10.918994 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.918975 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc8e7800d795086fdca1d6407793ef9-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal\" (UID: \"bfc8e7800d795086fdca1d6407793ef9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:10.919058 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.919002 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc8e7800d795086fdca1d6407793ef9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal\" (UID: \"bfc8e7800d795086fdca1d6407793ef9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:10.919058 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.919028 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/12581cfe4fc807f31862128cb3a75bcb-config\") pod \"kube-apiserver-proxy-ip-10-0-142-114.ec2.internal\" (UID: \"12581cfe4fc807f31862128cb3a75bcb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:10.919058 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.919056 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc8e7800d795086fdca1d6407793ef9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal\" (UID: \"bfc8e7800d795086fdca1d6407793ef9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:10.919161 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:10.919099 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc8e7800d795086fdca1d6407793ef9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal\" (UID: \"bfc8e7800d795086fdca1d6407793ef9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:11.000444 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:11.000371 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found"
Apr 17 11:16:11.019731 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.019707 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/12581cfe4fc807f31862128cb3a75bcb-config\") pod \"kube-apiserver-proxy-ip-10-0-142-114.ec2.internal\" (UID: \"12581cfe4fc807f31862128cb3a75bcb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:11.019810 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.019747 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/12581cfe4fc807f31862128cb3a75bcb-config\") pod \"kube-apiserver-proxy-ip-10-0-142-114.ec2.internal\" (UID: \"12581cfe4fc807f31862128cb3a75bcb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:11.100881 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:11.100848 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found"
Apr 17 11:16:11.151329 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.151303 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:11.155977 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.155952 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:11.201142 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:11.201115 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found"
Apr 17 11:16:11.301699 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:11.301646 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found"
Apr 17 11:16:11.402103 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:11.402076 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found"
Apr 17 11:16:11.424590 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.424554 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 11:16:11.424706 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.424692 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:16:11.503154 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:11.503126 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found"
Apr 17 11:16:11.516248 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.516230 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 11:16:11.522009 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.521983 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:11.536309 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.536291 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:11.565300 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.565241 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:11:10 +0000 UTC" deadline="2027-12-25 20:36:32.43764622 +0000 UTC"
Apr 17 11:16:11.565300 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.565268 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14817h20m20.872381161s"
Apr 17 11:16:11.570932 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.570913 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5dsls"
Apr 17 11:16:11.577903 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.577885 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5dsls"
Apr 17 11:16:11.603592 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:11.603572 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-114.ec2.internal\" not found"
Apr 17 11:16:11.607069 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.607054 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:11.617507 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.617490 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:11.629326 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.629308 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:16:11.631447 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.631428 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal"
Apr 17 11:16:11.639181 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:11.639157 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfc8e7800d795086fdca1d6407793ef9.slice/crio-13d953ebcf166966b14bb80e2f74cc455caf933ed4247614bae6da4ee7dd42ef WatchSource:0}: Error finding container 13d953ebcf166966b14bb80e2f74cc455caf933ed4247614bae6da4ee7dd42ef: Status 404 returned error can't find the container with id 13d953ebcf166966b14bb80e2f74cc455caf933ed4247614bae6da4ee7dd42ef
Apr 17 11:16:11.639533 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:11.639505 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12581cfe4fc807f31862128cb3a75bcb.slice/crio-9f8ffa33add139ebfeeed7523651269107b51122b705b4f719d04be228cab29a WatchSource:0}: Error finding container 9f8ffa33add139ebfeeed7523651269107b51122b705b4f719d04be228cab29a: Status 404 returned error can't find the container with id 9f8ffa33add139ebfeeed7523651269107b51122b705b4f719d04be228cab29a
Apr 17 11:16:11.643836 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.643820 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:16:11.644962 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.644950 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:16:11.717793 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.717753 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal" event={"ID":"12581cfe4fc807f31862128cb3a75bcb","Type":"ContainerStarted","Data":"9f8ffa33add139ebfeeed7523651269107b51122b705b4f719d04be228cab29a"}
Apr 17 11:16:11.718646 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.718624 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal" event={"ID":"bfc8e7800d795086fdca1d6407793ef9","Type":"ContainerStarted","Data":"13d953ebcf166966b14bb80e2f74cc455caf933ed4247614bae6da4ee7dd42ef"}
Apr 17 11:16:11.728168 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:11.728151 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:12.427077 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.427051 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:12.496535 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.496502 2568 apiserver.go:52] "Watching apiserver"
Apr 17 11:16:12.507796 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.507767 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 11:16:12.509808 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.509786 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-rz67d","openshift-ovn-kubernetes/ovnkube-node-sps6r","kube-system/konnectivity-agent-ndrlj","kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w","openshift-image-registry/node-ca-kgk95","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal","openshift-multus/multus-additional-cni-plugins-nw5h9","openshift-multus/network-metrics-daemon-4d5rw","openshift-network-diagnostics/network-check-target-srg6p","kube-system/global-pull-secret-syncer-tvkdx","openshift-cluster-node-tuning-operator/tuned-j7lpw","openshift-dns/node-resolver-h7hkk","openshift-multus/multus-4mt24"]
Apr 17 11:16:12.511869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.511844 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.514144 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.514124 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.514927 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.514833 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 11:16:12.514927 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.514876 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 11:16:12.514927 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.514895 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 11:16:12.515165 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.514905 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jm5kp\""
Apr 17 11:16:12.515165 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.514974 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 11:16:12.515165 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.515142 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ndrlj"
Apr 17 11:16:12.515165 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.515156 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 11:16:12.516360 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.516333 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.516800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.516691 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 11:16:12.517169 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.517148 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-f2dq9\""
Apr 17 11:16:12.517557 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.517537 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.517679 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.517662 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xdwq8\""
Apr 17 11:16:12.518059 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.518043 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 11:16:12.518368 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.518340 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 11:16:12.520324 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.520298 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kgk95"
Apr 17 11:16:12.521209 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.521187 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 11:16:12.521311 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.521295 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:16:12.522076 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.521628 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 11:16:12.522076 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.521675 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 11:16:12.522076 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.521791 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5fpwr\""
Apr 17 11:16:12.522076 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.521924 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 11:16:12.522076 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.521955 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw"
Apr 17 11:16:12.522359 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:12.522167 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3"
Apr 17 11:16:12.522359 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.521953 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9md9h\""
Apr 17 11:16:12.522985 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.522701 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 11:16:12.522985 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.522772 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vzb99\""
Apr 17 11:16:12.522985 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.522785 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 11:16:12.522985 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.522930 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 11:16:12.523671 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.523643 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rz67d"
Apr 17 11:16:12.524065 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.523853 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p"
Apr 17 11:16:12.524065 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:12.523970 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae"
Apr 17 11:16:12.525958 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.525938 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 11:16:12.526203 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.526186 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx"
Apr 17 11:16:12.526330 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:12.526262 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4"
Apr 17 11:16:12.526979 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.526961 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-cq9wg\""
Apr 17 11:16:12.527378 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.527358 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h7hkk"
Apr 17 11:16:12.527635 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.527528 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 11:16:12.527635 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.527532 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:16:12.528802 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.528784 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.529294 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529272 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-lib-modules\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.529447 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529429 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-host\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.529570 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529551 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-hostroot\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.529670 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529581 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-conf-dir\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.529670 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529602 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-sys\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.529670 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gbw5\" (UniqueName: \"kubernetes.io/projected/52957f49-ecc1-4e2d-9165-8b136f11b311-kube-api-access-8gbw5\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.529670 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529650 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a995cba3-0edd-41aa-923f-d47b9d050676-cni-binary-copy\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.529869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529674 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a995cba3-0edd-41aa-923f-d47b9d050676-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.529869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529698 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f-konnectivity-ca\") pod \"konnectivity-agent-ndrlj\" (UID: \"255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f\") " pod="kube-system/konnectivity-agent-ndrlj"
Apr 17 11:16:12.529869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529721 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.529869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ebdddb3-e6b6-4191-9db1-01e8d15cae25-host\") pod \"node-ca-kgk95\" (UID: \"0ebdddb3-e6b6-4191-9db1-01e8d15cae25\") " pod="openshift-image-registry/node-ca-kgk95"
Apr 17 11:16:12.529869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529765 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-csxvt\""
Apr 17 11:16:12.529869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529776 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 11:16:12.529869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529802 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j657\" (UniqueName: \"kubernetes.io/projected/0ebdddb3-e6b6-4191-9db1-01e8d15cae25-kube-api-access-2j657\") pod \"node-ca-kgk95\" (UID: \"0ebdddb3-e6b6-4191-9db1-01e8d15cae25\") " pod="openshift-image-registry/node-ca-kgk95"
Apr 17 11:16:12.529869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529814 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 11:16:12.529869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529839 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tbgk\" (UniqueName: \"kubernetes.io/projected/a995cba3-0edd-41aa-923f-d47b9d050676-kube-api-access-7tbgk\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.529869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529868 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-device-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529911 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-etc-selinux\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529941 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.529967 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n7w9\" (UniqueName: \"kubernetes.io/projected/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-kube-api-access-7n7w9\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530000 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-var-lib-cni-multus\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530023 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-run-multus-certs\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530048 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-etc-kubernetes\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530067 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-sysctl-conf\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530082 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-tuned\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530100 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-system-cni-dir\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530127 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-socket-dir-parent\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530157 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-modprobe-d\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-sysconfig\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530221 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-kubernetes\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530242 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-run\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530285 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b1b3449-3e5b-448f-a69c-f6678b42b96b-cni-binary-copy\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.530297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530308 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-os-release\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530331 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-run-k8s-cni-cncf-io\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530349 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-run-netns\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530371 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-var-lib-kubelet\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530434 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-systemd\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530465 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52957f49-ecc1-4e2d-9165-8b136f11b311-tmp\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530492 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530518 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-socket-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530542 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-daemon-config\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530565 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gxw6\" (UniqueName: \"kubernetes.io/projected/9b1b3449-3e5b-448f-a69c-f6678b42b96b-kube-api-access-9gxw6\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530605 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-var-lib-kubelet\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530648 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-cnibin\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530676 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a995cba3-0edd-41aa-923f-d47b9d050676-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530704 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f-agent-certs\") pod \"konnectivity-agent-ndrlj\" (UID: \"255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f\") " pod="kube-system/konnectivity-agent-ndrlj"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530726 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-sysctl-d\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530757 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0ebdddb3-e6b6-4191-9db1-01e8d15cae25-serviceca\") pod \"node-ca-kgk95\" (UID: \"0ebdddb3-e6b6-4191-9db1-01e8d15cae25\") " pod="openshift-image-registry/node-ca-kgk95"
Apr 17 11:16:12.530898 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530804
2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-registration-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.531627 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530830 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-sys-fs\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.531627 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530853 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdx46\" (UniqueName: \"kubernetes.io/projected/ab330144-ee92-4afb-ba55-21d109f563b6-kube-api-access-wdx46\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.531627 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530876 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-var-lib-cni-bin\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.531627 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530902 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-system-cni-dir\") pod \"multus-4mt24\" 
(UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.531627 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530926 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-cni-dir\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.531627 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-cnibin\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.531627 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.530983 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-os-release\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.531627 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.531074 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 11:16:12.531627 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.531217 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 11:16:12.531627 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.531537 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 11:16:12.532744 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.532706 
2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tn8wx\"" Apr 17 11:16:12.532834 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.532802 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 11:16:12.532899 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.532864 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 11:16:12.532899 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.532883 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 11:16:12.579543 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.579518 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:11 +0000 UTC" deadline="2027-11-21 11:34:12.133730732 +0000 UTC" Apr 17 11:16:12.579543 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.579544 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13992h17m59.554190747s" Apr 17 11:16:12.618916 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.618723 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 11:16:12.632013 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.631984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-socket-dir-parent\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.632167 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632024 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-kubernetes\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.632167 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632053 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9" Apr 17 11:16:12.632167 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632078 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-run-netns\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.632167 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632101 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-var-lib-kubelet\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.632167 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632118 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-socket-dir-parent\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.632167 ip-10-0-142-114 kubenswrapper[2568]: I0417 
11:16:12.632118 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-kubernetes\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.632167 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52957f49-ecc1-4e2d-9165-8b136f11b311-tmp\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632184 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-socket-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632211 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-var-lib-kubelet\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632218 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 
11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632230 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9" Apr 17 11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632186 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-run-netns\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632243 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-ovnkube-script-lib\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632296 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-daemon-config\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632301 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-socket-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632321 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gxw6\" (UniqueName: \"kubernetes.io/projected/9b1b3449-3e5b-448f-a69c-f6678b42b96b-kube-api-access-9gxw6\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632448 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-cnibin\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9" Apr 17 11:16:12.632523 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632502 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-kubelet\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632535 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-ovn-node-metrics-cert\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632538 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-cnibin\") pod 
\"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632587 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-sysctl-d\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632595 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632628 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0ebdddb3-e6b6-4191-9db1-01e8d15cae25-serviceca\") pod \"node-ca-kgk95\" (UID: \"0ebdddb3-e6b6-4191-9db1-01e8d15cae25\") " pod="openshift-image-registry/node-ca-kgk95" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-registration-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-sys-fs\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632786 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdx46\" (UniqueName: \"kubernetes.io/projected/ab330144-ee92-4afb-ba55-21d109f563b6-kube-api-access-wdx46\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632798 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-registration-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632834 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-sys-fs\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632859 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-sysctl-d\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632887 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2wpt\" (UniqueName: 
\"kubernetes.io/projected/c8d89149-a1c2-4e87-941b-ce08710499d4-kube-api-access-n2wpt\") pod \"iptables-alerter-rz67d\" (UID: \"c8d89149-a1c2-4e87-941b-ce08710499d4\") " pod="openshift-network-operator/iptables-alerter-rz67d" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.632997 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-var-lib-openvswitch\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.633033 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633037 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a995cba3-0edd-41aa-923f-d47b9d050676-cni-binary-copy\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633048 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-daemon-config\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-system-cni-dir\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633081 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-cnibin\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633117 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-lib-modules\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633127 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-cnibin\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633052 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0ebdddb3-e6b6-4191-9db1-01e8d15cae25-serviceca\") pod \"node-ca-kgk95\" (UID: \"0ebdddb3-e6b6-4191-9db1-01e8d15cae25\") " pod="openshift-image-registry/node-ca-kgk95" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633146 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-host\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633452 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-system-cni-dir\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633526 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61d4b955-d2fa-4cee-a5a9-5bb37d994e5f-hosts-file\") pod \"node-resolver-h7hkk\" (UID: \"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f\") " pod="openshift-dns/node-resolver-h7hkk" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633565 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c8d89149-a1c2-4e87-941b-ce08710499d4-iptables-alerter-script\") pod \"iptables-alerter-rz67d\" (UID: \"c8d89149-a1c2-4e87-941b-ce08710499d4\") " pod="openshift-network-operator/iptables-alerter-rz67d" Apr 17 11:16:12.633625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-hostroot\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.634182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633647 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-lib-modules\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.634182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633659 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-conf-dir\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.634182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633689 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-sys\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.634182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633738 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gbw5\" (UniqueName: \"kubernetes.io/projected/52957f49-ecc1-4e2d-9165-8b136f11b311-kube-api-access-8gbw5\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.634182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f-konnectivity-ca\") pod \"konnectivity-agent-ndrlj\" (UID: \"255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f\") " pod="kube-system/konnectivity-agent-ndrlj" Apr 17 11:16:12.634182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633886 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-device-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.634182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633915 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" 
(UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-etc-selinux\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.634182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.633946 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw"
Apr 17 11:16:12.634182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634117 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a995cba3-0edd-41aa-923f-d47b9d050676-cni-binary-copy\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.634567 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634288 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ebdddb3-e6b6-4191-9db1-01e8d15cae25-host\") pod \"node-ca-kgk95\" (UID: \"0ebdddb3-e6b6-4191-9db1-01e8d15cae25\") " pod="openshift-image-registry/node-ca-kgk95"
Apr 17 11:16:12.634567 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j657\" (UniqueName: \"kubernetes.io/projected/0ebdddb3-e6b6-4191-9db1-01e8d15cae25-kube-api-access-2j657\") pod \"node-ca-kgk95\" (UID: \"0ebdddb3-e6b6-4191-9db1-01e8d15cae25\") " pod="openshift-image-registry/node-ca-kgk95"
Apr 17 11:16:12.634567 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tbgk\" (UniqueName: \"kubernetes.io/projected/a995cba3-0edd-41aa-923f-d47b9d050676-kube-api-access-7tbgk\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.634567 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:12.634413 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:12.634567 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634430 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-run-netns\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.634567 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-cni-netd\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.634826 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:12.634591 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs podName:d84dc363-0ebb-4e0c-9b94-1024f80ccbb3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:13.134560992 +0000 UTC m=+3.057415811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs") pod "network-metrics-daemon-4d5rw" (UID: "d84dc363-0ebb-4e0c-9b94-1024f80ccbb3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:12.634826 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634634 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-env-overrides\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.634929 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634826 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-hostroot\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.634929 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634903 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-var-lib-cni-multus\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.635017 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634937 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-conf-dir\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.635017 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634950 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-run-multus-certs\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.635017 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.634989 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-etc-kubernetes\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.635135 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635027 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-sysctl-conf\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.635135 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635065 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-tuned\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.635193 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635123 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-systemd-units\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.635193 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635177 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-run-systemd\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.635251 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635224 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-cni-bin\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.635251 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635227 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ebdddb3-e6b6-4191-9db1-01e8d15cae25-host\") pod \"node-ca-kgk95\" (UID: \"0ebdddb3-e6b6-4191-9db1-01e8d15cae25\") " pod="openshift-image-registry/node-ca-kgk95"
Apr 17 11:16:12.635309 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635263 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-modprobe-d\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.635364 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635305 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-sysconfig\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.635364 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635353 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-run\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.635471 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635399 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b1b3449-3e5b-448f-a69c-f6678b42b96b-cni-binary-copy\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.635471 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635464 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-os-release\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.635553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635504 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-var-lib-cni-multus\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.635553 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635496 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-dbus\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx"
Apr 17 11:16:12.635638 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635566 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-run-multus-certs\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.635638 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635564 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-run-ovn\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.635638 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635603 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f-konnectivity-ca\") pod \"konnectivity-agent-ndrlj\" (UID: \"255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f\") " pod="kube-system/konnectivity-agent-ndrlj"
Apr 17 11:16:12.635638 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635620 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-etc-kubernetes\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.635638 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635629 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-ovnkube-config\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.635840 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635674 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-sysconfig\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.635840 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635696 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-run\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.635840 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635707 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-etc-selinux\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.635840 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635718 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-device-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.635840 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635463 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-sysctl-conf\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.636046 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-os-release\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.636046 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635840 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-run-k8s-cni-cncf-io\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.636046 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635882 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-systemd\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.636046 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635916 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-log-socket\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.636046 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635948 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-run-ovn-kubernetes\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.636046 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.635977 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-modprobe-d\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.636046 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636008 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-var-lib-kubelet\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.636338 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636045 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-systemd\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.636338 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636069 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a995cba3-0edd-41aa-923f-d47b9d050676-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.636338 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636108 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-run-k8s-cni-cncf-io\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.636338 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636168 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-var-lib-kubelet\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.636542 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52957f49-ecc1-4e2d-9165-8b136f11b311-tmp\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.636600 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636167 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f-agent-certs\") pod \"konnectivity-agent-ndrlj\" (UID: \"255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f\") " pod="kube-system/konnectivity-agent-ndrlj"
Apr 17 11:16:12.636648 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636623 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c8d89149-a1c2-4e87-941b-ce08710499d4-host-slash\") pod \"iptables-alerter-rz67d\" (UID: \"c8d89149-a1c2-4e87-941b-ce08710499d4\") " pod="openshift-network-operator/iptables-alerter-rz67d"
Apr 17 11:16:12.636699 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636657 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx"
Apr 17 11:16:12.636745 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636706 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-etc-openvswitch\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.636789 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636750 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzmz7\" (UniqueName: \"kubernetes.io/projected/61d4b955-d2fa-4cee-a5a9-5bb37d994e5f-kube-api-access-wzmz7\") pod \"node-resolver-h7hkk\" (UID: \"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f\") " pod="openshift-dns/node-resolver-h7hkk"
Apr 17 11:16:12.636832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636812 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-run-openvswitch\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.636879 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636839 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-var-lib-cni-bin\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.636879 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636869 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-cni-dir\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.636962 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636902 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-os-release\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.636962 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636908 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a995cba3-0edd-41aa-923f-d47b9d050676-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.636962 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636934 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg5hl\" (UniqueName: \"kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl\") pod \"network-check-target-srg6p\" (UID: \"cb867750-66e0-49fa-b347-fa907f29bbae\") " pod="openshift-network-diagnostics/network-check-target-srg6p"
Apr 17 11:16:12.637091 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.636966 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-slash\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.637091 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637010 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-node-log\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.637091 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637015 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-sys\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.637091 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a995cba3-0edd-41aa-923f-d47b9d050676-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.637257 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637102 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.637257 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637106 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-host-var-lib-cni-bin\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.637257 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637134 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7w9\" (UniqueName: \"kubernetes.io/projected/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-kube-api-access-7n7w9\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw"
Apr 17 11:16:12.637257 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637164 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61d4b955-d2fa-4cee-a5a9-5bb37d994e5f-tmp-dir\") pod \"node-resolver-h7hkk\" (UID: \"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f\") " pod="openshift-dns/node-resolver-h7hkk"
Apr 17 11:16:12.637257 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637214 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-kubelet-config\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx"
Apr 17 11:16:12.637487 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm4mt\" (UniqueName: \"kubernetes.io/projected/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-kube-api-access-sm4mt\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.637487 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637373 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-os-release\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.637487 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637420 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-system-cni-dir\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.637487 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637461 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52957f49-ecc1-4e2d-9165-8b136f11b311-host\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.637666 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637550 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a995cba3-0edd-41aa-923f-d47b9d050676-system-cni-dir\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.637906 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.637884 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab330144-ee92-4afb-ba55-21d109f563b6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.638304 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.638283 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b1b3449-3e5b-448f-a69c-f6678b42b96b-multus-cni-dir\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.638945 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.638925 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b1b3449-3e5b-448f-a69c-f6678b42b96b-cni-binary-copy\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.640518 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.640495 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f-agent-certs\") pod \"konnectivity-agent-ndrlj\" (UID: \"255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f\") " pod="kube-system/konnectivity-agent-ndrlj"
Apr 17 11:16:12.640586 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.640549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/52957f49-ecc1-4e2d-9165-8b136f11b311-etc-tuned\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.641828 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.641809 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a995cba3-0edd-41aa-923f-d47b9d050676-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.645793 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.645759 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j657\" (UniqueName: \"kubernetes.io/projected/0ebdddb3-e6b6-4191-9db1-01e8d15cae25-kube-api-access-2j657\") pod \"node-ca-kgk95\" (UID: \"0ebdddb3-e6b6-4191-9db1-01e8d15cae25\") " pod="openshift-image-registry/node-ca-kgk95"
Apr 17 11:16:12.646262 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.646201 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tbgk\" (UniqueName: \"kubernetes.io/projected/a995cba3-0edd-41aa-923f-d47b9d050676-kube-api-access-7tbgk\") pod \"multus-additional-cni-plugins-nw5h9\" (UID: \"a995cba3-0edd-41aa-923f-d47b9d050676\") " pod="openshift-multus/multus-additional-cni-plugins-nw5h9"
Apr 17 11:16:12.646505 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.646453 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdx46\" (UniqueName: \"kubernetes.io/projected/ab330144-ee92-4afb-ba55-21d109f563b6-kube-api-access-wdx46\") pod \"aws-ebs-csi-driver-node-qs97w\" (UID: \"ab330144-ee92-4afb-ba55-21d109f563b6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w"
Apr 17 11:16:12.646598 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.646547 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gxw6\" (UniqueName: \"kubernetes.io/projected/9b1b3449-3e5b-448f-a69c-f6678b42b96b-kube-api-access-9gxw6\") pod \"multus-4mt24\" (UID: \"9b1b3449-3e5b-448f-a69c-f6678b42b96b\") " pod="openshift-multus/multus-4mt24"
Apr 17 11:16:12.647715 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.647694 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n7w9\" (UniqueName: \"kubernetes.io/projected/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-kube-api-access-7n7w9\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw"
Apr 17 11:16:12.648148 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.648129 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gbw5\" (UniqueName: \"kubernetes.io/projected/52957f49-ecc1-4e2d-9165-8b136f11b311-kube-api-access-8gbw5\") pod \"tuned-j7lpw\" (UID: \"52957f49-ecc1-4e2d-9165-8b136f11b311\") " pod="openshift-cluster-node-tuning-operator/tuned-j7lpw"
Apr 17 11:16:12.738427 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c8d89149-a1c2-4e87-941b-ce08710499d4-host-slash\") pod \"iptables-alerter-rz67d\" (UID: \"c8d89149-a1c2-4e87-941b-ce08710499d4\") " pod="openshift-network-operator/iptables-alerter-rz67d"
Apr 17 11:16:12.738427 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738342 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx"
Apr 17 11:16:12.738427 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738364 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-etc-openvswitch\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.738427 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzmz7\" (UniqueName: \"kubernetes.io/projected/61d4b955-d2fa-4cee-a5a9-5bb37d994e5f-kube-api-access-wzmz7\") pod \"node-resolver-h7hkk\" (UID: \"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f\") " pod="openshift-dns/node-resolver-h7hkk"
Apr 17 11:16:12.738723 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738448 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c8d89149-a1c2-4e87-941b-ce08710499d4-host-slash\") pod \"iptables-alerter-rz67d\" (UID: \"c8d89149-a1c2-4e87-941b-ce08710499d4\") " pod="openshift-network-operator/iptables-alerter-rz67d"
Apr 17 11:16:12.738723 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:12.738505 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:12.738723 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:12.738568 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret podName:4f77cfcb-60a1-4c91-8f58-dac82efa3fe4 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:13.238548039 +0000 UTC m=+3.161402838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret") pod "global-pull-secret-syncer-tvkdx" (UID: "4f77cfcb-60a1-4c91-8f58-dac82efa3fe4") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:12.738723 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738583 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-etc-openvswitch\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.738723 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-run-openvswitch\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.738723 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738663 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5hl\" (UniqueName: \"kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl\") pod \"network-check-target-srg6p\" (UID: \"cb867750-66e0-49fa-b347-fa907f29bbae\") " pod="openshift-network-diagnostics/network-check-target-srg6p"
Apr 17 11:16:12.738723 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738686 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-slash\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.738723 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738687 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-run-openvswitch\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.738723 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738704 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-node-log\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r"
Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61d4b955-d2fa-4cee-a5a9-5bb37d994e5f-tmp-dir\") pod \"node-resolver-h7hkk\" (UID: \"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f\") " pod="openshift-dns/node-resolver-h7hkk"
Apr
17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738762 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-node-log\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738779 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-kubelet-config\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738803 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-slash\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738839 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sm4mt\" (UniqueName: \"kubernetes.io/projected/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-kube-api-access-sm4mt\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738882 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sps6r\" (UID: 
\"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738904 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-ovnkube-script-lib\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738930 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-kubelet\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738953 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-ovn-node-metrics-cert\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738976 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2wpt\" (UniqueName: 
\"kubernetes.io/projected/c8d89149-a1c2-4e87-941b-ce08710499d4-kube-api-access-n2wpt\") pod \"iptables-alerter-rz67d\" (UID: \"c8d89149-a1c2-4e87-941b-ce08710499d4\") " pod="openshift-network-operator/iptables-alerter-rz67d" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.738930 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-kubelet-config\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739001 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-var-lib-openvswitch\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739110 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-kubelet\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739126 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61d4b955-d2fa-4cee-a5a9-5bb37d994e5f-tmp-dir\") pod \"node-resolver-h7hkk\" (UID: \"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f\") " pod="openshift-dns/node-resolver-h7hkk" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739154 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-var-lib-openvswitch\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739217 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61d4b955-d2fa-4cee-a5a9-5bb37d994e5f-hosts-file\") pod \"node-resolver-h7hkk\" (UID: \"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f\") " pod="openshift-dns/node-resolver-h7hkk" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739244 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c8d89149-a1c2-4e87-941b-ce08710499d4-iptables-alerter-script\") pod \"iptables-alerter-rz67d\" (UID: \"c8d89149-a1c2-4e87-941b-ce08710499d4\") " pod="openshift-network-operator/iptables-alerter-rz67d" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739293 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-run-netns\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739370 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61d4b955-d2fa-4cee-a5a9-5bb37d994e5f-hosts-file\") pod \"node-resolver-h7hkk\" (UID: \"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f\") " pod="openshift-dns/node-resolver-h7hkk" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739447 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-run-netns\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739477 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-cni-netd\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739503 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-env-overrides\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739537 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-systemd-units\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739558 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-ovnkube-script-lib\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739562 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-run-systemd\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739579 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-cni-netd\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739587 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-cni-bin\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739609 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-systemd-units\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739620 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-cni-bin\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739627 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-dbus\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:12.739758 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739654 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-run-ovn\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739656 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-run-systemd\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739693 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-run-ovn\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739720 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-ovnkube-config\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739752 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-dbus\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739770 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-log-socket\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739805 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-run-ovn-kubernetes\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739854 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c8d89149-a1c2-4e87-941b-ce08710499d4-iptables-alerter-script\") pod \"iptables-alerter-rz67d\" (UID: \"c8d89149-a1c2-4e87-941b-ce08710499d4\") " pod="openshift-network-operator/iptables-alerter-rz67d" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739864 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-host-run-ovn-kubernetes\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739899 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-log-socket\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.739942 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-env-overrides\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.740577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.740164 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-ovnkube-config\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.742109 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.742082 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-ovn-node-metrics-cert\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.745518 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:12.745502 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:12.745611 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:12.745521 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:12.745611 ip-10-0-142-114 
kubenswrapper[2568]: E0417 11:16:12.745531 2568 projected.go:194] Error preparing data for projected volume kube-api-access-dg5hl for pod openshift-network-diagnostics/network-check-target-srg6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:12.745611 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:12.745576 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl podName:cb867750-66e0-49fa-b347-fa907f29bbae nodeName:}" failed. No retries permitted until 2026-04-17 11:16:13.245564046 +0000 UTC m=+3.168418855 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dg5hl" (UniqueName: "kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl") pod "network-check-target-srg6p" (UID: "cb867750-66e0-49fa-b347-fa907f29bbae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:12.747685 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.747665 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2wpt\" (UniqueName: \"kubernetes.io/projected/c8d89149-a1c2-4e87-941b-ce08710499d4-kube-api-access-n2wpt\") pod \"iptables-alerter-rz67d\" (UID: \"c8d89149-a1c2-4e87-941b-ce08710499d4\") " pod="openshift-network-operator/iptables-alerter-rz67d" Apr 17 11:16:12.748559 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.748521 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzmz7\" (UniqueName: \"kubernetes.io/projected/61d4b955-d2fa-4cee-a5a9-5bb37d994e5f-kube-api-access-wzmz7\") pod \"node-resolver-h7hkk\" (UID: \"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f\") " pod="openshift-dns/node-resolver-h7hkk" Apr 17 
11:16:12.748651 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.748562 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm4mt\" (UniqueName: \"kubernetes.io/projected/f0fa0497-6cc8-4a84-b902-a5b9ad486d28-kube-api-access-sm4mt\") pod \"ovnkube-node-sps6r\" (UID: \"f0fa0497-6cc8-4a84-b902-a5b9ad486d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:12.825339 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.825308 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nw5h9" Apr 17 11:16:12.839023 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.838996 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4mt24" Apr 17 11:16:12.846742 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.846724 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ndrlj" Apr 17 11:16:12.851294 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.851276 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" Apr 17 11:16:12.856818 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.856795 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" Apr 17 11:16:12.863454 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.863437 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kgk95" Apr 17 11:16:12.870943 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.870928 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rz67d" Apr 17 11:16:12.877456 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.877440 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h7hkk" Apr 17 11:16:12.882985 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:12.882968 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:13.142597 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.142517 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:13.142750 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:13.142661 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.142750 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:13.142731 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs podName:d84dc363-0ebb-4e0c-9b94-1024f80ccbb3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.142713549 +0000 UTC m=+4.065568346 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs") pod "network-metrics-daemon-4d5rw" (UID: "d84dc363-0ebb-4e0c-9b94-1024f80ccbb3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.210663 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:13.210624 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ebdddb3_e6b6_4191_9db1_01e8d15cae25.slice/crio-36971daca47961072b485dc14b9190a1f8adc5f522935d48ec6b2f2720561288 WatchSource:0}: Error finding container 36971daca47961072b485dc14b9190a1f8adc5f522935d48ec6b2f2720561288: Status 404 returned error can't find the container with id 36971daca47961072b485dc14b9190a1f8adc5f522935d48ec6b2f2720561288 Apr 17 11:16:13.213036 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:13.212897 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d4b955_d2fa_4cee_a5a9_5bb37d994e5f.slice/crio-d277b42cf01563534546de4ff5cd4e5734a2b1fa16bc32587c7ba68b2ce17285 WatchSource:0}: Error finding container d277b42cf01563534546de4ff5cd4e5734a2b1fa16bc32587c7ba68b2ce17285: Status 404 returned error can't find the container with id d277b42cf01563534546de4ff5cd4e5734a2b1fa16bc32587c7ba68b2ce17285 Apr 17 11:16:13.216078 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:13.215850 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b1b3449_3e5b_448f_a69c_f6678b42b96b.slice/crio-644dcbbbabff6f61056a9f71ed3a1adbd27286a6ad1439348df49963947db3dc WatchSource:0}: Error finding container 644dcbbbabff6f61056a9f71ed3a1adbd27286a6ad1439348df49963947db3dc: Status 404 returned error can't find the container with id 644dcbbbabff6f61056a9f71ed3a1adbd27286a6ad1439348df49963947db3dc Apr 17 11:16:13.217889 
ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:13.217851 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52957f49_ecc1_4e2d_9165_8b136f11b311.slice/crio-c0eb68f7b0bf6a0e1c0b31e4bc16d3695d85727cfd4080fa0346f226da2911dc WatchSource:0}: Error finding container c0eb68f7b0bf6a0e1c0b31e4bc16d3695d85727cfd4080fa0346f226da2911dc: Status 404 returned error can't find the container with id c0eb68f7b0bf6a0e1c0b31e4bc16d3695d85727cfd4080fa0346f226da2911dc Apr 17 11:16:13.218908 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:13.218887 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab330144_ee92_4afb_ba55_21d109f563b6.slice/crio-1c1f4e727d2d1594399c3291ec77ca66a4b10d8a8b29dad7ce46c92dadb0d639 WatchSource:0}: Error finding container 1c1f4e727d2d1594399c3291ec77ca66a4b10d8a8b29dad7ce46c92dadb0d639: Status 404 returned error can't find the container with id 1c1f4e727d2d1594399c3291ec77ca66a4b10d8a8b29dad7ce46c92dadb0d639 Apr 17 11:16:13.219659 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:13.219579 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8d89149_a1c2_4e87_941b_ce08710499d4.slice/crio-f87bcccfc4b213f8df6292ee9c0a570883a6bcc4b92f87c62eb5c385c8bc8173 WatchSource:0}: Error finding container f87bcccfc4b213f8df6292ee9c0a570883a6bcc4b92f87c62eb5c385c8bc8173: Status 404 returned error can't find the container with id f87bcccfc4b213f8df6292ee9c0a570883a6bcc4b92f87c62eb5c385c8bc8173 Apr 17 11:16:13.221880 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:13.221173 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda995cba3_0edd_41aa_923f_d47b9d050676.slice/crio-8edefd1211df899c04f409a614a657347885ae638122508226f74aa63f6605e7 WatchSource:0}: 
Error finding container 8edefd1211df899c04f409a614a657347885ae638122508226f74aa63f6605e7: Status 404 returned error can't find the container with id 8edefd1211df899c04f409a614a657347885ae638122508226f74aa63f6605e7 Apr 17 11:16:13.221880 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:13.221491 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0fa0497_6cc8_4a84_b902_a5b9ad486d28.slice/crio-f98960c0b7c958a1eeb10adaae00ca62ac88295772d8c23dcd5bf624c4d1c591 WatchSource:0}: Error finding container f98960c0b7c958a1eeb10adaae00ca62ac88295772d8c23dcd5bf624c4d1c591: Status 404 returned error can't find the container with id f98960c0b7c958a1eeb10adaae00ca62ac88295772d8c23dcd5bf624c4d1c591 Apr 17 11:16:13.223086 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:16:13.223060 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod255d82a1_7244_4c1c_ab7b_1ad9c2d49e6f.slice/crio-91e35f6a6e31b022231515a7f9a05c7614df858f5b9223c5c83d69f004c51f5d WatchSource:0}: Error finding container 91e35f6a6e31b022231515a7f9a05c7614df858f5b9223c5c83d69f004c51f5d: Status 404 returned error can't find the container with id 91e35f6a6e31b022231515a7f9a05c7614df858f5b9223c5c83d69f004c51f5d Apr 17 11:16:13.242792 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.242766 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:13.242884 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:13.242873 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:13.242944 
ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:13.242917 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret podName:4f77cfcb-60a1-4c91-8f58-dac82efa3fe4 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.242906002 +0000 UTC m=+4.165760798 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret") pod "global-pull-secret-syncer-tvkdx" (UID: "4f77cfcb-60a1-4c91-8f58-dac82efa3fe4") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:13.343846 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.343817 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5hl\" (UniqueName: \"kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl\") pod \"network-check-target-srg6p\" (UID: \"cb867750-66e0-49fa-b347-fa907f29bbae\") " pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:13.343958 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:13.343917 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:13.343958 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:13.343930 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:13.343958 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:13.343938 2568 projected.go:194] Error preparing data for projected volume kube-api-access-dg5hl for pod openshift-network-diagnostics/network-check-target-srg6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:13.344071 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:13.343980 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl podName:cb867750-66e0-49fa-b347-fa907f29bbae nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.343965409 +0000 UTC m=+4.266820204 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg5hl" (UniqueName: "kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl") pod "network-check-target-srg6p" (UID: "cb867750-66e0-49fa-b347-fa907f29bbae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:13.580694 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.580600 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:11 +0000 UTC" deadline="2027-11-10 01:41:27.046568528 +0000 UTC" Apr 17 11:16:13.580694 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.580642 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13718h25m13.465931219s" Apr 17 11:16:13.715928 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.715865 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:13.716095 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:13.715998 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:13.716160 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.716119 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:13.716223 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:13.716204 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:13.730300 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.729775 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw5h9" event={"ID":"a995cba3-0edd-41aa-923f-d47b9d050676","Type":"ContainerStarted","Data":"8edefd1211df899c04f409a614a657347885ae638122508226f74aa63f6605e7"} Apr 17 11:16:13.731421 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.731375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" event={"ID":"52957f49-ecc1-4e2d-9165-8b136f11b311","Type":"ContainerStarted","Data":"c0eb68f7b0bf6a0e1c0b31e4bc16d3695d85727cfd4080fa0346f226da2911dc"} Apr 17 11:16:13.733157 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.733131 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mt24" event={"ID":"9b1b3449-3e5b-448f-a69c-f6678b42b96b","Type":"ContainerStarted","Data":"644dcbbbabff6f61056a9f71ed3a1adbd27286a6ad1439348df49963947db3dc"} Apr 17 11:16:13.735489 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.735463 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h7hkk" 
event={"ID":"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f","Type":"ContainerStarted","Data":"d277b42cf01563534546de4ff5cd4e5734a2b1fa16bc32587c7ba68b2ce17285"} Apr 17 11:16:13.741508 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.741480 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" event={"ID":"f0fa0497-6cc8-4a84-b902-a5b9ad486d28","Type":"ContainerStarted","Data":"f98960c0b7c958a1eeb10adaae00ca62ac88295772d8c23dcd5bf624c4d1c591"} Apr 17 11:16:13.745397 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.745361 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rz67d" event={"ID":"c8d89149-a1c2-4e87-941b-ce08710499d4","Type":"ContainerStarted","Data":"f87bcccfc4b213f8df6292ee9c0a570883a6bcc4b92f87c62eb5c385c8bc8173"} Apr 17 11:16:13.749650 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.749611 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" event={"ID":"ab330144-ee92-4afb-ba55-21d109f563b6","Type":"ContainerStarted","Data":"1c1f4e727d2d1594399c3291ec77ca66a4b10d8a8b29dad7ce46c92dadb0d639"} Apr 17 11:16:13.752167 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.752142 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kgk95" event={"ID":"0ebdddb3-e6b6-4191-9db1-01e8d15cae25","Type":"ContainerStarted","Data":"36971daca47961072b485dc14b9190a1f8adc5f522935d48ec6b2f2720561288"} Apr 17 11:16:13.758119 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.758094 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal" event={"ID":"12581cfe4fc807f31862128cb3a75bcb","Type":"ContainerStarted","Data":"d69f861f63e5ff7bdd5f3aa2e4fdf7e6390f8296d2d3ada4db29230eb37e4021"} Apr 17 11:16:13.767427 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:13.766522 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kube-system/konnectivity-agent-ndrlj" event={"ID":"255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f","Type":"ContainerStarted","Data":"91e35f6a6e31b022231515a7f9a05c7614df858f5b9223c5c83d69f004c51f5d"} Apr 17 11:16:14.149395 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:14.149354 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:14.149604 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:14.149559 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.149689 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:14.149620 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs podName:d84dc363-0ebb-4e0c-9b94-1024f80ccbb3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:16.149602792 +0000 UTC m=+6.072457591 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs") pod "network-metrics-daemon-4d5rw" (UID: "d84dc363-0ebb-4e0c-9b94-1024f80ccbb3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.250561 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:14.250522 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:14.250735 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:14.250686 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:14.250806 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:14.250748 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret podName:4f77cfcb-60a1-4c91-8f58-dac82efa3fe4 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:16.25073041 +0000 UTC m=+6.173585208 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret") pod "global-pull-secret-syncer-tvkdx" (UID: "4f77cfcb-60a1-4c91-8f58-dac82efa3fe4") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:14.351951 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:14.351918 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5hl\" (UniqueName: \"kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl\") pod \"network-check-target-srg6p\" (UID: \"cb867750-66e0-49fa-b347-fa907f29bbae\") " pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:14.352086 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:14.352060 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:14.352086 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:14.352079 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:14.352198 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:14.352091 2568 projected.go:194] Error preparing data for projected volume kube-api-access-dg5hl for pod openshift-network-diagnostics/network-check-target-srg6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:14.352198 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:14.352142 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl podName:cb867750-66e0-49fa-b347-fa907f29bbae nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:16.352127508 +0000 UTC m=+6.274982304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg5hl" (UniqueName: "kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl") pod "network-check-target-srg6p" (UID: "cb867750-66e0-49fa-b347-fa907f29bbae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:14.716000 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:14.715921 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:14.716487 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:14.716064 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:14.796087 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:14.794857 2568 generic.go:358] "Generic (PLEG): container finished" podID="bfc8e7800d795086fdca1d6407793ef9" containerID="13b8b7b12dae7ada526fc996dd68577b1321566bf017a48ac7d2c0f418fb1301" exitCode=0 Apr 17 11:16:14.796087 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:14.795795 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal" event={"ID":"bfc8e7800d795086fdca1d6407793ef9","Type":"ContainerDied","Data":"13b8b7b12dae7ada526fc996dd68577b1321566bf017a48ac7d2c0f418fb1301"} Apr 17 11:16:14.813832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:14.813001 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-114.ec2.internal" podStartSLOduration=3.812984984 podStartE2EDuration="3.812984984s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:13.778600107 +0000 UTC m=+3.701454927" watchObservedRunningTime="2026-04-17 11:16:14.812984984 +0000 UTC m=+4.735839803" Apr 17 11:16:15.716116 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:15.715577 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:15.716116 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:15.715767 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:15.716734 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:15.716158 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:15.716734 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:15.716264 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:15.802300 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:15.801514 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal" event={"ID":"bfc8e7800d795086fdca1d6407793ef9","Type":"ContainerStarted","Data":"890d453c264600d6c1386fd5536ea9b0ef9f853c4af046de05e6cfd45192d6b3"} Apr 17 11:16:16.167119 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:16.167037 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:16.167282 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:16.167204 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:16.167282 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:16.167278 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs podName:d84dc363-0ebb-4e0c-9b94-1024f80ccbb3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:20.167258559 +0000 UTC m=+10.090113364 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs") pod "network-metrics-daemon-4d5rw" (UID: "d84dc363-0ebb-4e0c-9b94-1024f80ccbb3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:16.268331 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:16.268287 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:16.268526 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:16.268491 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:16.268586 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:16.268553 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret podName:4f77cfcb-60a1-4c91-8f58-dac82efa3fe4 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:20.268535601 +0000 UTC m=+10.191390401 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret") pod "global-pull-secret-syncer-tvkdx" (UID: "4f77cfcb-60a1-4c91-8f58-dac82efa3fe4") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:16.369463 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:16.369361 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5hl\" (UniqueName: \"kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl\") pod \"network-check-target-srg6p\" (UID: \"cb867750-66e0-49fa-b347-fa907f29bbae\") " pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:16.369723 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:16.369695 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:16.369723 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:16.369724 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:16.369886 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:16.369740 2568 projected.go:194] Error preparing data for projected volume kube-api-access-dg5hl for pod openshift-network-diagnostics/network-check-target-srg6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:16.369886 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:16.369802 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl podName:cb867750-66e0-49fa-b347-fa907f29bbae nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:20.369783661 +0000 UTC m=+10.292638460 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg5hl" (UniqueName: "kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl") pod "network-check-target-srg6p" (UID: "cb867750-66e0-49fa-b347-fa907f29bbae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:16.715962 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:16.715930 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:16.716153 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:16.716061 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:17.715434 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:17.715399 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:17.715434 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:17.715435 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:17.715667 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:17.715535 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:17.715736 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:17.715682 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:18.715304 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:18.715269 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:18.715782 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:18.715410 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:19.715237 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:19.715201 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:19.715421 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:19.715212 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:19.715421 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:19.715353 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:19.715810 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:19.715422 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:20.202298 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:20.202213 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:20.202456 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:20.202425 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:20.202549 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:20.202507 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs podName:d84dc363-0ebb-4e0c-9b94-1024f80ccbb3 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:28.202487177 +0000 UTC m=+18.125341986 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs") pod "network-metrics-daemon-4d5rw" (UID: "d84dc363-0ebb-4e0c-9b94-1024f80ccbb3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:20.302876 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:20.302798 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:20.303050 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:20.302989 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:20.303119 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:20.303054 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret podName:4f77cfcb-60a1-4c91-8f58-dac82efa3fe4 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:28.303035346 +0000 UTC m=+18.225890145 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret") pod "global-pull-secret-syncer-tvkdx" (UID: "4f77cfcb-60a1-4c91-8f58-dac82efa3fe4") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:20.403517 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:20.403470 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5hl\" (UniqueName: \"kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl\") pod \"network-check-target-srg6p\" (UID: \"cb867750-66e0-49fa-b347-fa907f29bbae\") " pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:20.403685 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:20.403630 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:20.403685 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:20.403649 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:20.403685 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:20.403662 2568 projected.go:194] Error preparing data for projected volume kube-api-access-dg5hl for pod openshift-network-diagnostics/network-check-target-srg6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:20.403834 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:20.403731 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl podName:cb867750-66e0-49fa-b347-fa907f29bbae nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:28.403712888 +0000 UTC m=+18.326567686 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg5hl" (UniqueName: "kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl") pod "network-check-target-srg6p" (UID: "cb867750-66e0-49fa-b347-fa907f29bbae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:20.716159 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:20.716121 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:20.716593 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:20.716242 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:21.715091 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:21.715052 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:21.715260 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:21.715106 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:21.715260 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:21.715225 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:21.715358 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:21.715327 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:22.716068 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:22.716022 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:22.716491 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:22.716165 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:23.715175 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:23.715139 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:23.715343 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:23.715139 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:23.715343 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:23.715282 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:23.715494 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:23.715372 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:24.715843 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:24.715801 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:24.716257 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:24.715924 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:25.715377 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:25.715346 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:25.715542 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:25.715357 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:25.715542 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:25.715473 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:25.715608 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:25.715538 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:26.715648 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:26.715440 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:26.716085 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:26.715744 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:27.715430 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:27.715396 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:27.715670 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:27.715398 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:27.715670 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:27.715520 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:27.715670 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:27.715614 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:28.265414 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:28.265367 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:28.265599 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:28.265517 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:28.265599 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:28.265597 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs podName:d84dc363-0ebb-4e0c-9b94-1024f80ccbb3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.265576813 +0000 UTC m=+34.188431611 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs") pod "network-metrics-daemon-4d5rw" (UID: "d84dc363-0ebb-4e0c-9b94-1024f80ccbb3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:28.366571 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:28.366533 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:28.366739 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:28.366684 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:28.366787 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:28.366747 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret podName:4f77cfcb-60a1-4c91-8f58-dac82efa3fe4 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.366732982 +0000 UTC m=+34.289587778 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret") pod "global-pull-secret-syncer-tvkdx" (UID: "4f77cfcb-60a1-4c91-8f58-dac82efa3fe4") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:28.467731 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:28.467702 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5hl\" (UniqueName: \"kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl\") pod \"network-check-target-srg6p\" (UID: \"cb867750-66e0-49fa-b347-fa907f29bbae\") " pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:28.467892 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:28.467845 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:28.467892 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:28.467863 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:28.467892 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:28.467872 2568 projected.go:194] Error preparing data for projected volume kube-api-access-dg5hl for pod openshift-network-diagnostics/network-check-target-srg6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:28.468015 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:28.467919 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl podName:cb867750-66e0-49fa-b347-fa907f29bbae nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:44.467905625 +0000 UTC m=+34.390760421 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg5hl" (UniqueName: "kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl") pod "network-check-target-srg6p" (UID: "cb867750-66e0-49fa-b347-fa907f29bbae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:28.715812 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:28.715776 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:28.716245 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:28.715899 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:29.715669 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:29.715636 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:29.715829 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:29.715684 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:29.715829 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:29.715787 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:29.716151 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:29.715932 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:30.716019 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.715802 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:30.717091 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:30.716077 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:30.829000 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.828963 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" event={"ID":"f0fa0497-6cc8-4a84-b902-a5b9ad486d28","Type":"ContainerStarted","Data":"38fb51048a1b71cef74b54c00e19c29f3923788b9d722af28483e06db24a9751"} Apr 17 11:16:30.829000 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.829000 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" event={"ID":"f0fa0497-6cc8-4a84-b902-a5b9ad486d28","Type":"ContainerStarted","Data":"8c333d8ed4a8ea0bd5651d92913aaa358db405a17b96b88c6fe6abd89169a48c"} Apr 17 11:16:30.829226 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.829010 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" event={"ID":"f0fa0497-6cc8-4a84-b902-a5b9ad486d28","Type":"ContainerStarted","Data":"92e2bb10d683f4e8f26fbec0ba1cfc158ebe6b197d3a1ca67386cff0ba65899f"} Apr 17 11:16:30.829226 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.829020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" event={"ID":"f0fa0497-6cc8-4a84-b902-a5b9ad486d28","Type":"ContainerStarted","Data":"c9eb8f8edf75db5e4dca741ba13b81b1a4153a57caefa2875a2410f9082c8ca2"} Apr 17 11:16:30.829226 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.829033 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" event={"ID":"f0fa0497-6cc8-4a84-b902-a5b9ad486d28","Type":"ContainerStarted","Data":"98f34af5ca27a40c7be9a0b16d77b01e8a5a0521cafba152e9a4102a7fd80563"} Apr 17 11:16:30.829226 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.829046 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" 
event={"ID":"f0fa0497-6cc8-4a84-b902-a5b9ad486d28","Type":"ContainerStarted","Data":"f1eddaf2a80ac18ff57e240adf0273c3fd89018236642fcb38d509eef6544c51"} Apr 17 11:16:30.830828 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.830792 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" event={"ID":"ab330144-ee92-4afb-ba55-21d109f563b6","Type":"ContainerStarted","Data":"8e9d072c2a5820021bd68db3d6cba72f634d9419a09db9fc7af620a8028dcaad"} Apr 17 11:16:30.832245 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.832157 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kgk95" event={"ID":"0ebdddb3-e6b6-4191-9db1-01e8d15cae25","Type":"ContainerStarted","Data":"82b30b18a518e4b30e9d1e525d37b4f7063ef0364b2aa674d7da0402e0d95079"} Apr 17 11:16:30.833469 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.833449 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ndrlj" event={"ID":"255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f","Type":"ContainerStarted","Data":"2bb130aded9fbb9027285bf356d4667a4070aeff4080a1ff5abb17a1385dbdbd"} Apr 17 11:16:30.834808 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.834783 2568 generic.go:358] "Generic (PLEG): container finished" podID="a995cba3-0edd-41aa-923f-d47b9d050676" containerID="e8961b196f822e4b0362fc63ade674074e691508be3fb45e9b0c4afe40025f69" exitCode=0 Apr 17 11:16:30.834919 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.834856 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw5h9" event={"ID":"a995cba3-0edd-41aa-923f-d47b9d050676","Type":"ContainerDied","Data":"e8961b196f822e4b0362fc63ade674074e691508be3fb45e9b0c4afe40025f69"} Apr 17 11:16:30.836128 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.836108 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" 
event={"ID":"52957f49-ecc1-4e2d-9165-8b136f11b311","Type":"ContainerStarted","Data":"f1270170e717a9bb603c7afb790e264909b462e86900fc28edbd1ca3baeb952c"} Apr 17 11:16:30.838532 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.838019 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mt24" event={"ID":"9b1b3449-3e5b-448f-a69c-f6678b42b96b","Type":"ContainerStarted","Data":"430b6120cb8d4e3a0817ecb3f25b0dd7d21ba4229b815249f8eab4a9f00d6aa9"} Apr 17 11:16:30.840453 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.840398 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h7hkk" event={"ID":"61d4b955-d2fa-4cee-a5a9-5bb37d994e5f","Type":"ContainerStarted","Data":"25cb2fb82cb2014d0d641d76b540bfb26709fd9e409c1323d561ae7e5fb59ed8"} Apr 17 11:16:30.846079 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.846048 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-114.ec2.internal" podStartSLOduration=19.846019303 podStartE2EDuration="19.846019303s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:15.820125812 +0000 UTC m=+5.742980631" watchObservedRunningTime="2026-04-17 11:16:30.846019303 +0000 UTC m=+20.768874117" Apr 17 11:16:30.846840 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.846812 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kgk95" podStartSLOduration=8.449910854 podStartE2EDuration="20.846803447s" podCreationTimestamp="2026-04-17 11:16:10 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.213354229 +0000 UTC m=+3.136209024" lastFinishedPulling="2026-04-17 11:16:25.610246822 +0000 UTC m=+15.533101617" observedRunningTime="2026-04-17 11:16:30.846168104 +0000 UTC m=+20.769022918" 
watchObservedRunningTime="2026-04-17 11:16:30.846803447 +0000 UTC m=+20.769658265" Apr 17 11:16:30.859049 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.859009 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h7hkk" podStartSLOduration=3.018697502 podStartE2EDuration="19.85899587s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.215579852 +0000 UTC m=+3.138434654" lastFinishedPulling="2026-04-17 11:16:30.055878215 +0000 UTC m=+19.978733022" observedRunningTime="2026-04-17 11:16:30.858374409 +0000 UTC m=+20.781229227" watchObservedRunningTime="2026-04-17 11:16:30.85899587 +0000 UTC m=+20.781850688" Apr 17 11:16:30.878021 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.877972 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4mt24" podStartSLOduration=3.737242107 podStartE2EDuration="20.877932818s" podCreationTimestamp="2026-04-17 11:16:10 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.218160416 +0000 UTC m=+3.141015216" lastFinishedPulling="2026-04-17 11:16:30.358851131 +0000 UTC m=+20.281705927" observedRunningTime="2026-04-17 11:16:30.877212179 +0000 UTC m=+20.800066999" watchObservedRunningTime="2026-04-17 11:16:30.877932818 +0000 UTC m=+20.800787635" Apr 17 11:16:30.897488 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.897451 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ndrlj" podStartSLOduration=4.069533986 podStartE2EDuration="20.897440228s" podCreationTimestamp="2026-04-17 11:16:10 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.22483018 +0000 UTC m=+3.147684976" lastFinishedPulling="2026-04-17 11:16:30.052736407 +0000 UTC m=+19.975591218" observedRunningTime="2026-04-17 11:16:30.896535441 +0000 UTC m=+20.819390259" watchObservedRunningTime="2026-04-17 11:16:30.897440228 +0000 UTC m=+20.820295046" Apr 17 11:16:30.911772 
ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:30.911738 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-j7lpw" podStartSLOduration=4.067642979 podStartE2EDuration="20.911731405s" podCreationTimestamp="2026-04-17 11:16:10 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.219840277 +0000 UTC m=+3.142695087" lastFinishedPulling="2026-04-17 11:16:30.06392871 +0000 UTC m=+19.986783513" observedRunningTime="2026-04-17 11:16:30.911421096 +0000 UTC m=+20.834275915" watchObservedRunningTime="2026-04-17 11:16:30.911731405 +0000 UTC m=+20.834586254" Apr 17 11:16:31.205783 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:31.205609 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 11:16:31.594397 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:31.594273 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:16:31.205780531Z","UUID":"b754d9a3-3315-4ed3-9974-6f5886785950","Handler":null,"Name":"","Endpoint":""} Apr 17 11:16:31.597108 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:31.596869 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 11:16:31.597108 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:31.596900 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 11:16:31.715719 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:31.715689 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:31.715879 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:31.715730 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:31.715879 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:31.715799 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:31.716009 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:31.715878 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:31.845625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:31.845566 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rz67d" event={"ID":"c8d89149-a1c2-4e87-941b-ce08710499d4","Type":"ContainerStarted","Data":"53ca589a841bdd85dd37543dc7c6bf2069585dc51e66db029eb7b0b574803ef9"} Apr 17 11:16:31.848879 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:31.848840 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" event={"ID":"ab330144-ee92-4afb-ba55-21d109f563b6","Type":"ContainerStarted","Data":"7b7f460e8d9eff80ca8d0bf9a8ffdbcdf0af3846b546c53ef987bb592362e2b9"} Apr 17 11:16:31.865510 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:31.865461 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rz67d" podStartSLOduration=4.031859428 podStartE2EDuration="20.865445082s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.222269554 +0000 UTC m=+3.145124352" lastFinishedPulling="2026-04-17 11:16:30.055855193 +0000 UTC m=+19.978710006" observedRunningTime="2026-04-17 11:16:31.865055741 +0000 UTC m=+21.787910580" watchObservedRunningTime="2026-04-17 11:16:31.865445082 +0000 UTC m=+21.788299903" Apr 17 11:16:32.715086 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:32.715061 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:32.715227 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:32.715183 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:32.854182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:32.854149 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" event={"ID":"f0fa0497-6cc8-4a84-b902-a5b9ad486d28","Type":"ContainerStarted","Data":"48e3a254e0350e3fc91826223113f70bb538bedebcf4e75cbcf80a993d24f790"} Apr 17 11:16:32.856240 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:32.856210 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" event={"ID":"ab330144-ee92-4afb-ba55-21d109f563b6","Type":"ContainerStarted","Data":"ec6808e0b913cd13845326698eccc166718835f124d5b6d22c959a6b79be5589"} Apr 17 11:16:32.874529 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:32.874480 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qs97w" podStartSLOduration=4.132069854 podStartE2EDuration="22.874463301s" podCreationTimestamp="2026-04-17 11:16:10 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.221400492 +0000 UTC m=+3.144255288" lastFinishedPulling="2026-04-17 11:16:31.963793936 +0000 UTC m=+21.886648735" observedRunningTime="2026-04-17 11:16:32.873362864 +0000 UTC m=+22.796217681" watchObservedRunningTime="2026-04-17 11:16:32.874463301 +0000 UTC m=+22.797318120" Apr 17 11:16:33.564822 ip-10-0-142-114 kubenswrapper[2568]: I0417 
11:16:33.564792 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ndrlj" Apr 17 11:16:33.565618 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:33.565598 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ndrlj" Apr 17 11:16:33.715919 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:33.715891 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:33.716090 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:33.715901 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:33.716090 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:33.716013 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:33.716090 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:33.716081 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:33.858589 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:33.858524 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ndrlj" Apr 17 11:16:33.859179 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:33.859007 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ndrlj" Apr 17 11:16:34.715246 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:34.715194 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:34.715437 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:34.715339 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:35.715418 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:35.715240 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:35.715869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:35.715247 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:35.715869 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:35.715603 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:35.715869 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:35.715484 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:35.864212 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:35.864181 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" event={"ID":"f0fa0497-6cc8-4a84-b902-a5b9ad486d28","Type":"ContainerStarted","Data":"c98ef6657305ad8d68d4aa2ae64686eb34ff9d5741918f59d48546902cacdbc1"} Apr 17 11:16:35.864466 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:35.864448 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:35.865980 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:35.865954 2568 generic.go:358] "Generic (PLEG): container finished" podID="a995cba3-0edd-41aa-923f-d47b9d050676" containerID="1bfd74bcd6305aed1c2227fb4612a0f2138d0dbb468cc7fd46314734440d6d5b" exitCode=0 Apr 17 11:16:35.866077 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:35.866047 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw5h9" event={"ID":"a995cba3-0edd-41aa-923f-d47b9d050676","Type":"ContainerDied","Data":"1bfd74bcd6305aed1c2227fb4612a0f2138d0dbb468cc7fd46314734440d6d5b"} Apr 17 11:16:35.878675 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:35.878656 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 
11:16:35.890660 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:35.890621 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" podStartSLOduration=7.902087652 podStartE2EDuration="24.890609721s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.223958841 +0000 UTC m=+3.146813637" lastFinishedPulling="2026-04-17 11:16:30.212480905 +0000 UTC m=+20.135335706" observedRunningTime="2026-04-17 11:16:35.889021577 +0000 UTC m=+25.811876394" watchObservedRunningTime="2026-04-17 11:16:35.890609721 +0000 UTC m=+25.813464576" Apr 17 11:16:36.715722 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.715501 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:36.718413 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:36.716105 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:36.799434 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.799410 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tvkdx"] Apr 17 11:16:36.799557 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.799508 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:36.799605 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:36.799588 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:36.802232 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.802208 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-srg6p"] Apr 17 11:16:36.802864 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.802839 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4d5rw"] Apr 17 11:16:36.802943 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.802932 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:36.803041 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:36.803022 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:36.869718 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.869693 2568 generic.go:358] "Generic (PLEG): container finished" podID="a995cba3-0edd-41aa-923f-d47b9d050676" containerID="74937e78eced02e79e42e2615ad67e3349c9c7876451ca4e839109f3b2a6d054" exitCode=0 Apr 17 11:16:36.869833 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.869775 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:36.869833 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.869777 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw5h9" event={"ID":"a995cba3-0edd-41aa-923f-d47b9d050676","Type":"ContainerDied","Data":"74937e78eced02e79e42e2615ad67e3349c9c7876451ca4e839109f3b2a6d054"} Apr 17 11:16:36.869955 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.869938 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:16:36.870020 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:36.869996 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:36.870878 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.870377 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:36.884234 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:36.884214 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:37.873278 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:37.873248 2568 generic.go:358] "Generic (PLEG): container finished" podID="a995cba3-0edd-41aa-923f-d47b9d050676" containerID="dc9dd00b1fcb736bcd59a0328676fef771ddbfd9f034d6d3cfebe17292731d79" exitCode=0 Apr 17 11:16:37.873650 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:37.873336 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw5h9" event={"ID":"a995cba3-0edd-41aa-923f-d47b9d050676","Type":"ContainerDied","Data":"dc9dd00b1fcb736bcd59a0328676fef771ddbfd9f034d6d3cfebe17292731d79"} Apr 17 11:16:37.873650 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:37.873415 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:16:38.715494 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:38.715406 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:38.715494 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:38.715441 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:38.715685 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:38.715409 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:38.715685 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:38.715552 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:38.715685 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:38.715635 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:38.715840 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:38.715727 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:38.875097 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:38.875069 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:16:40.716658 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:40.716627 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:40.717316 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:40.716728 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:40.717316 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:40.716817 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:40.717316 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:40.716918 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:40.717316 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:40.716947 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:40.717316 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:40.716968 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:41.137914 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:41.137866 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:16:41.138128 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:41.138110 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:16:41.152480 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:41.152426 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" podUID="f0fa0497-6cc8-4a84-b902-a5b9ad486d28" containerName="ovnkube-controller" probeResult="failure" output="" Apr 17 11:16:41.161911 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:41.161882 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" podUID="f0fa0497-6cc8-4a84-b902-a5b9ad486d28" containerName="ovnkube-controller" probeResult="failure" output="" Apr 17 11:16:42.715896 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.715858 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:42.716342 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.715855 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:42.716342 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:42.715992 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:16:42.716342 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.715872 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:42.716342 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:42.716055 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tvkdx" podUID="4f77cfcb-60a1-4c91-8f58-dac82efa3fe4" Apr 17 11:16:42.716342 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:42.716141 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-srg6p" podUID="cb867750-66e0-49fa-b347-fa907f29bbae" Apr 17 11:16:42.901468 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.901440 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-114.ec2.internal" event="NodeReady" Apr 17 11:16:42.901613 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.901573 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 11:16:42.937251 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.937216 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-799f5b7986-bv6sb"] Apr 17 11:16:42.967417 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.967330 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-799f5b7986-bv6sb"] Apr 17 11:16:42.967417 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.967365 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hst8c"] Apr 17 11:16:42.967603 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.967504 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:42.970306 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.970265 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 11:16:42.970484 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.970348 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 11:16:42.970484 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.970436 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 11:16:42.970759 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.970732 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xz4zw\"" Apr 17 11:16:42.977057 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.976619 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 11:16:42.982493 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.982471 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8jgxz"] Apr 17 11:16:42.982596 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.982573 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:42.985625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.985605 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bd2vh\"" Apr 17 11:16:42.985711 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.985629 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 11:16:42.985711 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.985608 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 11:16:42.987031 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.986998 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:42.987128 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987060 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwkll\" (UniqueName: \"kubernetes.io/projected/9443878d-c2b0-4771-b41e-f23e0fff86a4-kube-api-access-cwkll\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:42.987196 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987135 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-image-registry-private-configuration\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 
11:16:42.987196 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987162 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-ca-trust-extracted\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:42.987196 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987186 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-trusted-ca\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:42.987336 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987207 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-bound-sa-token\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:42.987336 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987239 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9443878d-c2b0-4771-b41e-f23e0fff86a4-tmp-dir\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:42.987336 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987264 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-certificates\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:42.987336 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987281 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9443878d-c2b0-4771-b41e-f23e0fff86a4-config-volume\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:42.987336 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987318 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:42.987591 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987353 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-installation-pull-secrets\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:42.987591 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:42.987378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq5lk\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-kube-api-access-zq5lk\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " 
pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.006038 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.006018 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hst8c"] Apr 17 11:16:43.006136 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.006046 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8jgxz"] Apr 17 11:16:43.006201 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.006150 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:43.008697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.008677 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 11:16:43.008788 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.008769 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 11:16:43.008924 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.008908 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j8tns\"" Apr 17 11:16:43.009188 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.009174 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 11:16:43.088142 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088098 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwkll\" (UniqueName: \"kubernetes.io/projected/9443878d-c2b0-4771-b41e-f23e0fff86a4-kube-api-access-cwkll\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:43.088313 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088194 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-image-registry-private-configuration\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.088313 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-ca-trust-extracted\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.088313 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-trusted-ca\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.088313 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088275 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-bound-sa-token\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.088543 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9443878d-c2b0-4771-b41e-f23e0fff86a4-tmp-dir\") pod \"dns-default-hst8c\" (UID: 
\"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:43.088543 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088337 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-certificates\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.088543 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9443878d-c2b0-4771-b41e-f23e0fff86a4-config-volume\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:43.088543 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxld7\" (UniqueName: \"kubernetes.io/projected/015be0d7-ff4e-4b65-b3ee-73d579ba395e-kube-api-access-cxld7\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:43.088543 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088463 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.088543 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088484 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:43.088543 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088512 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-installation-pull-secrets\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.088543 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088543 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq5lk\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-kube-api-access-zq5lk\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.088995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:43.088995 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.088684 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:43.088995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088724 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-ca-trust-extracted\") pod \"image-registry-799f5b7986-bv6sb\" (UID: 
\"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.088995 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.088738 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls podName:9443878d-c2b0-4771-b41e-f23e0fff86a4 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:43.588721841 +0000 UTC m=+33.511576655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls") pod "dns-default-hst8c" (UID: "9443878d-c2b0-4771-b41e-f23e0fff86a4") : secret "dns-default-metrics-tls" not found Apr 17 11:16:43.088995 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.088835 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:43.088995 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.088848 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799f5b7986-bv6sb: secret "image-registry-tls" not found Apr 17 11:16:43.088995 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.088891 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls podName:16f785fc-52f1-4f27-a4e9-f56d09ae67b2 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:43.588874766 +0000 UTC m=+33.511729577 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls") pod "image-registry-799f5b7986-bv6sb" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2") : secret "image-registry-tls" not found Apr 17 11:16:43.088995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.088897 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9443878d-c2b0-4771-b41e-f23e0fff86a4-tmp-dir\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:43.089396 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.089110 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9443878d-c2b0-4771-b41e-f23e0fff86a4-config-volume\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:43.089396 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.089134 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-certificates\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.093497 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.093340 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-image-registry-private-configuration\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.093602 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.093376 
2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-installation-pull-secrets\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.095509 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.095451 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-trusted-ca\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.097682 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.097656 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-bound-sa-token\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.097877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.097831 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwkll\" (UniqueName: \"kubernetes.io/projected/9443878d-c2b0-4771-b41e-f23e0fff86a4-kube-api-access-cwkll\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:43.098100 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.098062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq5lk\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-kube-api-access-zq5lk\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " 
pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.189518 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.189475 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxld7\" (UniqueName: \"kubernetes.io/projected/015be0d7-ff4e-4b65-b3ee-73d579ba395e-kube-api-access-cxld7\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:43.189700 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.189547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:43.189700 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.189674 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:43.189814 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.189747 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert podName:015be0d7-ff4e-4b65-b3ee-73d579ba395e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:43.689727061 +0000 UTC m=+33.612581872 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert") pod "ingress-canary-8jgxz" (UID: "015be0d7-ff4e-4b65-b3ee-73d579ba395e") : secret "canary-serving-cert" not found Apr 17 11:16:43.201945 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.201914 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxld7\" (UniqueName: \"kubernetes.io/projected/015be0d7-ff4e-4b65-b3ee-73d579ba395e-kube-api-access-cxld7\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:43.592759 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.592726 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:43.592922 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.592876 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:43.592922 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.592887 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:43.593038 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.592954 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls podName:9443878d-c2b0-4771-b41e-f23e0fff86a4 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:44.592933512 +0000 UTC m=+34.515788309 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls") pod "dns-default-hst8c" (UID: "9443878d-c2b0-4771-b41e-f23e0fff86a4") : secret "dns-default-metrics-tls" not found Apr 17 11:16:43.593038 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.592986 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:43.593038 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.592998 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799f5b7986-bv6sb: secret "image-registry-tls" not found Apr 17 11:16:43.593038 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.593037 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls podName:16f785fc-52f1-4f27-a4e9-f56d09ae67b2 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.593027395 +0000 UTC m=+34.515882191 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls") pod "image-registry-799f5b7986-bv6sb" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2") : secret "image-registry-tls" not found Apr 17 11:16:43.693317 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:43.693290 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:43.693447 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.693433 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:43.693490 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:43.693482 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert podName:015be0d7-ff4e-4b65-b3ee-73d579ba395e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.693467381 +0000 UTC m=+34.616322177 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert") pod "ingress-canary-8jgxz" (UID: "015be0d7-ff4e-4b65-b3ee-73d579ba395e") : secret "canary-serving-cert" not found Apr 17 11:16:44.296588 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.296556 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:44.297007 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.296699 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:44.297007 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.296773 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs podName:d84dc363-0ebb-4e0c-9b94-1024f80ccbb3 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:16.296754402 +0000 UTC m=+66.219609199 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs") pod "network-metrics-daemon-4d5rw" (UID: "d84dc363-0ebb-4e0c-9b94-1024f80ccbb3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:44.397768 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.397695 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:44.397900 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.397813 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:44.397900 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.397875 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret podName:4f77cfcb-60a1-4c91-8f58-dac82efa3fe4 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:16.397858476 +0000 UTC m=+66.320713274 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret") pod "global-pull-secret-syncer-tvkdx" (UID: "4f77cfcb-60a1-4c91-8f58-dac82efa3fe4") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:44.498857 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.498826 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5hl\" (UniqueName: \"kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl\") pod \"network-check-target-srg6p\" (UID: \"cb867750-66e0-49fa-b347-fa907f29bbae\") " pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:44.499005 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.498961 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:44.499005 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.498975 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:44.499005 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.498983 2568 projected.go:194] Error preparing data for projected volume kube-api-access-dg5hl for pod openshift-network-diagnostics/network-check-target-srg6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:44.499104 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.499030 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl podName:cb867750-66e0-49fa-b347-fa907f29bbae nodeName:}" failed. 
No retries permitted until 2026-04-17 11:17:16.499015793 +0000 UTC m=+66.421870592 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dg5hl" (UniqueName: "kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl") pod "network-check-target-srg6p" (UID: "cb867750-66e0-49fa-b347-fa907f29bbae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:44.599872 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.599844 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:44.600002 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.599943 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:44.600002 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.599994 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:44.600064 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.600038 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:44.600064 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.600049 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799f5b7986-bv6sb: secret "image-registry-tls" not found Apr 
17 11:16:44.600064 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.600059 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls podName:9443878d-c2b0-4771-b41e-f23e0fff86a4 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:46.60003987 +0000 UTC m=+36.522894673 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls") pod "dns-default-hst8c" (UID: "9443878d-c2b0-4771-b41e-f23e0fff86a4") : secret "dns-default-metrics-tls" not found Apr 17 11:16:44.600161 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.600078 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls podName:16f785fc-52f1-4f27-a4e9-f56d09ae67b2 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:46.600069174 +0000 UTC m=+36.522923970 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls") pod "image-registry-799f5b7986-bv6sb" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2") : secret "image-registry-tls" not found Apr 17 11:16:44.701077 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.701007 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:44.701206 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.701147 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:44.701206 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:44.701206 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert podName:015be0d7-ff4e-4b65-b3ee-73d579ba395e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:46.701192475 +0000 UTC m=+36.624047275 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert") pod "ingress-canary-8jgxz" (UID: "015be0d7-ff4e-4b65-b3ee-73d579ba395e") : secret "canary-serving-cert" not found Apr 17 11:16:44.715596 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.715570 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:16:44.715714 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.715570 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:16:44.715765 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.715571 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:16:44.725022 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.725004 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:16:44.725176 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.725162 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nzl46\"" Apr 17 11:16:44.725605 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.725589 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:16:44.725675 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.725657 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:16:44.725743 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.725725 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-g9l5h\"" Apr 17 11:16:44.725786 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.725762 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:16:44.886806 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.886774 2568 generic.go:358] "Generic (PLEG): container finished" podID="a995cba3-0edd-41aa-923f-d47b9d050676" containerID="0923bb9a93d5c7ee05f31a7f51e6cae77f3bbd0dd78cdd6ec37e10b97b9cb1bf" exitCode=0 Apr 17 11:16:44.886949 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:44.886823 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nw5h9" event={"ID":"a995cba3-0edd-41aa-923f-d47b9d050676","Type":"ContainerDied","Data":"0923bb9a93d5c7ee05f31a7f51e6cae77f3bbd0dd78cdd6ec37e10b97b9cb1bf"} Apr 17 11:16:45.890981 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:45.890951 2568 generic.go:358] "Generic (PLEG): container finished" podID="a995cba3-0edd-41aa-923f-d47b9d050676" containerID="1de7c4719821c76456736514cac00684343125554b0bcb3c07e98d3d338f4f17" exitCode=0 Apr 17 11:16:45.890981 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:45.890989 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw5h9" event={"ID":"a995cba3-0edd-41aa-923f-d47b9d050676","Type":"ContainerDied","Data":"1de7c4719821c76456736514cac00684343125554b0bcb3c07e98d3d338f4f17"} Apr 17 11:16:46.615362 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:46.615275 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:46.615362 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:46.615330 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:46.615610 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:46.615460 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:46.615610 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:46.615490 2568 projected.go:194] Error preparing data for projected volume 
registry-tls for pod openshift-image-registry/image-registry-799f5b7986-bv6sb: secret "image-registry-tls" not found Apr 17 11:16:46.615610 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:46.615552 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls podName:16f785fc-52f1-4f27-a4e9-f56d09ae67b2 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:50.615533196 +0000 UTC m=+40.538388006 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls") pod "image-registry-799f5b7986-bv6sb" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2") : secret "image-registry-tls" not found Apr 17 11:16:46.615610 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:46.615466 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:46.615610 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:46.615590 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls podName:9443878d-c2b0-4771-b41e-f23e0fff86a4 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:50.615579883 +0000 UTC m=+40.538434680 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls") pod "dns-default-hst8c" (UID: "9443878d-c2b0-4771-b41e-f23e0fff86a4") : secret "dns-default-metrics-tls" not found Apr 17 11:16:46.716322 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:46.716284 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:46.716515 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:46.716465 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:46.716582 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:46.716538 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert podName:015be0d7-ff4e-4b65-b3ee-73d579ba395e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:50.7165189 +0000 UTC m=+40.639373696 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert") pod "ingress-canary-8jgxz" (UID: "015be0d7-ff4e-4b65-b3ee-73d579ba395e") : secret "canary-serving-cert" not found Apr 17 11:16:46.895188 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:46.895116 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw5h9" event={"ID":"a995cba3-0edd-41aa-923f-d47b9d050676","Type":"ContainerStarted","Data":"368c5e33ca39cebc90d253c3535b9f7cd1c800cb0864a1dd8d70af58034ba9d4"} Apr 17 11:16:46.924318 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:46.924277 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nw5h9" podStartSLOduration=6.325177626 podStartE2EDuration="36.924260452s" podCreationTimestamp="2026-04-17 11:16:10 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.222449884 +0000 UTC m=+3.145304685" lastFinishedPulling="2026-04-17 11:16:43.821532715 +0000 UTC m=+33.744387511" observedRunningTime="2026-04-17 11:16:46.922705673 +0000 UTC m=+36.845560490" watchObservedRunningTime="2026-04-17 11:16:46.924260452 +0000 UTC m=+36.847115292" Apr 17 11:16:50.646669 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:50.646637 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:50.647103 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:50.646687 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: 
\"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:50.647103 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:50.646794 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:50.647103 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:50.646806 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:50.647103 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:50.646827 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799f5b7986-bv6sb: secret "image-registry-tls" not found Apr 17 11:16:50.647103 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:50.646839 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls podName:9443878d-c2b0-4771-b41e-f23e0fff86a4 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:58.64682615 +0000 UTC m=+48.569680946 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls") pod "dns-default-hst8c" (UID: "9443878d-c2b0-4771-b41e-f23e0fff86a4") : secret "dns-default-metrics-tls" not found Apr 17 11:16:50.647103 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:50.646880 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls podName:16f785fc-52f1-4f27-a4e9-f56d09ae67b2 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:58.64686245 +0000 UTC m=+48.569717254 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls") pod "image-registry-799f5b7986-bv6sb" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2") : secret "image-registry-tls" not found Apr 17 11:16:50.747238 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:50.747211 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:50.747370 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:50.747310 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:50.747370 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:50.747350 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert podName:015be0d7-ff4e-4b65-b3ee-73d579ba395e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:58.747339061 +0000 UTC m=+48.670193857 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert") pod "ingress-canary-8jgxz" (UID: "015be0d7-ff4e-4b65-b3ee-73d579ba395e") : secret "canary-serving-cert" not found Apr 17 11:16:58.706641 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:58.706601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:16:58.707084 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:58.706684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:16:58.707084 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:58.706768 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:58.707084 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:58.706769 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:58.707084 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:58.706863 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls podName:9443878d-c2b0-4771-b41e-f23e0fff86a4 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:14.706841994 +0000 UTC m=+64.629696790 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls") pod "dns-default-hst8c" (UID: "9443878d-c2b0-4771-b41e-f23e0fff86a4") : secret "dns-default-metrics-tls" not found Apr 17 11:16:58.707084 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:58.706778 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799f5b7986-bv6sb: secret "image-registry-tls" not found Apr 17 11:16:58.707084 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:58.706904 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls podName:16f785fc-52f1-4f27-a4e9-f56d09ae67b2 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:14.706895066 +0000 UTC m=+64.629749866 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls") pod "image-registry-799f5b7986-bv6sb" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2") : secret "image-registry-tls" not found Apr 17 11:16:58.807242 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:16:58.807211 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:16:58.807404 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:58.807328 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:58.807481 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:16:58.807411 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert 
podName:015be0d7-ff4e-4b65-b3ee-73d579ba395e nodeName:}" failed. No retries permitted until 2026-04-17 11:17:14.807375285 +0000 UTC m=+64.730230083 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert") pod "ingress-canary-8jgxz" (UID: "015be0d7-ff4e-4b65-b3ee-73d579ba395e") : secret "canary-serving-cert" not found Apr 17 11:17:11.162722 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:11.162691 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sps6r" Apr 17 11:17:14.720399 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:14.720355 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:17:14.720752 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:14.720421 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:17:14.720752 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:14.720491 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:17:14.720752 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:14.720508 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799f5b7986-bv6sb: secret "image-registry-tls" not found Apr 17 11:17:14.720752 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:14.720522 2568 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:14.720752 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:14.720568 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls podName:16f785fc-52f1-4f27-a4e9-f56d09ae67b2 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:46.720553567 +0000 UTC m=+96.643408362 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls") pod "image-registry-799f5b7986-bv6sb" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2") : secret "image-registry-tls" not found Apr 17 11:17:14.720752 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:14.720581 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls podName:9443878d-c2b0-4771-b41e-f23e0fff86a4 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:46.720575739 +0000 UTC m=+96.643430535 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls") pod "dns-default-hst8c" (UID: "9443878d-c2b0-4771-b41e-f23e0fff86a4") : secret "dns-default-metrics-tls" not found Apr 17 11:17:14.821226 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:14.821198 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:17:14.821363 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:14.821307 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:14.821422 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:14.821367 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert podName:015be0d7-ff4e-4b65-b3ee-73d579ba395e nodeName:}" failed. No retries permitted until 2026-04-17 11:17:46.821353349 +0000 UTC m=+96.744208149 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert") pod "ingress-canary-8jgxz" (UID: "015be0d7-ff4e-4b65-b3ee-73d579ba395e") : secret "canary-serving-cert" not found Apr 17 11:17:16.330916 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.330868 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:17:16.333846 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.333824 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:17:16.341102 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:16.341081 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:17:16.341184 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:16.341147 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs podName:d84dc363-0ebb-4e0c-9b94-1024f80ccbb3 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:20.341126894 +0000 UTC m=+130.263981689 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs") pod "network-metrics-daemon-4d5rw" (UID: "d84dc363-0ebb-4e0c-9b94-1024f80ccbb3") : secret "metrics-daemon-secret" not found Apr 17 11:17:16.431438 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.431410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:17:16.434468 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.434449 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:17:16.444782 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.444760 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f77cfcb-60a1-4c91-8f58-dac82efa3fe4-original-pull-secret\") pod \"global-pull-secret-syncer-tvkdx\" (UID: \"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4\") " pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:17:16.530745 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.530706 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tvkdx" Apr 17 11:17:16.532456 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.532437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5hl\" (UniqueName: \"kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl\") pod \"network-check-target-srg6p\" (UID: \"cb867750-66e0-49fa-b347-fa907f29bbae\") " pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:17:16.535489 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.535472 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:17:16.545724 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.545704 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:17:16.555925 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.555904 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg5hl\" (UniqueName: \"kubernetes.io/projected/cb867750-66e0-49fa-b347-fa907f29bbae-kube-api-access-dg5hl\") pod \"network-check-target-srg6p\" (UID: \"cb867750-66e0-49fa-b347-fa907f29bbae\") " pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:17:16.685588 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.685556 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tvkdx"] Apr 17 11:17:16.827845 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.827816 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nzl46\"" Apr 17 11:17:16.835198 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.835132 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:17:16.948480 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.948427 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-srg6p"] Apr 17 11:17:16.949604 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:16.949578 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tvkdx" event={"ID":"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4","Type":"ContainerStarted","Data":"2090fb8d15d10280cf22377694e00d5b7304c84adfa9d996fb669d92124958fb"} Apr 17 11:17:16.952011 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:17:16.951988 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb867750_66e0_49fa_b347_fa907f29bbae.slice/crio-d2f26e42ff81f62a698b0d238d067b6e4a39f84e13bd0c12812832dea389e1f6 WatchSource:0}: Error finding container d2f26e42ff81f62a698b0d238d067b6e4a39f84e13bd0c12812832dea389e1f6: Status 404 returned error can't find the container with id d2f26e42ff81f62a698b0d238d067b6e4a39f84e13bd0c12812832dea389e1f6 Apr 17 11:17:17.952929 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:17.952894 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-srg6p" event={"ID":"cb867750-66e0-49fa-b347-fa907f29bbae","Type":"ContainerStarted","Data":"d2f26e42ff81f62a698b0d238d067b6e4a39f84e13bd0c12812832dea389e1f6"} Apr 17 11:17:21.963678 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:21.963641 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tvkdx" event={"ID":"4f77cfcb-60a1-4c91-8f58-dac82efa3fe4","Type":"ContainerStarted","Data":"acd47ccef5398a3044b4caa1db2d74997b8715052c9eb6525495eab8033078e3"} Apr 17 11:17:21.967290 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:21.967263 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-diagnostics/network-check-target-srg6p" event={"ID":"cb867750-66e0-49fa-b347-fa907f29bbae","Type":"ContainerStarted","Data":"1d4101f9d20d05850d9965ebfb6fdf3b93b8e96130f28f349b3bc3e53dd020df"} Apr 17 11:17:21.967440 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:21.967425 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:17:21.979832 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:21.979793 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tvkdx" podStartSLOduration=66.584112272 podStartE2EDuration="1m10.979781227s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:17:16.691508469 +0000 UTC m=+66.614363268" lastFinishedPulling="2026-04-17 11:17:21.087177415 +0000 UTC m=+71.010032223" observedRunningTime="2026-04-17 11:17:21.979156 +0000 UTC m=+71.902010818" watchObservedRunningTime="2026-04-17 11:17:21.979781227 +0000 UTC m=+71.902636045" Apr 17 11:17:21.993880 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:21.993837 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-srg6p" podStartSLOduration=66.864479068 podStartE2EDuration="1m10.993825865s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:17:16.953703675 +0000 UTC m=+66.876558474" lastFinishedPulling="2026-04-17 11:17:21.083050472 +0000 UTC m=+71.005905271" observedRunningTime="2026-04-17 11:17:21.993352294 +0000 UTC m=+71.916207112" watchObservedRunningTime="2026-04-17 11:17:21.993825865 +0000 UTC m=+71.916680682" Apr 17 11:17:46.742513 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:46.742429 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" Apr 17 11:17:46.742513 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:46.742474 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c" Apr 17 11:17:46.742906 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:46.742570 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:46.742906 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:46.742626 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls podName:9443878d-c2b0-4771-b41e-f23e0fff86a4 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:50.742613854 +0000 UTC m=+160.665468649 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls") pod "dns-default-hst8c" (UID: "9443878d-c2b0-4771-b41e-f23e0fff86a4") : secret "dns-default-metrics-tls" not found Apr 17 11:17:46.742906 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:46.742568 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:17:46.742906 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:46.742665 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799f5b7986-bv6sb: secret "image-registry-tls" not found Apr 17 11:17:46.742906 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:46.742718 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls podName:16f785fc-52f1-4f27-a4e9-f56d09ae67b2 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:50.742706563 +0000 UTC m=+160.665561359 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls") pod "image-registry-799f5b7986-bv6sb" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2") : secret "image-registry-tls" not found Apr 17 11:17:46.843756 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:46.843729 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz" Apr 17 11:17:46.843893 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:46.843824 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:46.843893 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:17:46.843867 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert podName:015be0d7-ff4e-4b65-b3ee-73d579ba395e nodeName:}" failed. No retries permitted until 2026-04-17 11:18:50.843855578 +0000 UTC m=+160.766710373 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert") pod "ingress-canary-8jgxz" (UID: "015be0d7-ff4e-4b65-b3ee-73d579ba395e") : secret "canary-serving-cert" not found Apr 17 11:17:52.972524 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:17:52.972498 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-srg6p" Apr 17 11:18:08.023995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.023955 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"] Apr 17 11:18:08.026731 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.026713 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" Apr 17 11:18:08.031984 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.031946 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 11:18:08.031984 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.031968 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 11:18:08.031984 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.031982 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-v296b\"" Apr 17 11:18:08.032214 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.032030 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 11:18:08.032214 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.032088 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 
11:18:08.036178 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.036158 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"] Apr 17 11:18:08.128624 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.127265 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8q9jb"] Apr 17 11:18:08.130683 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.130659 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.134326 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.134298 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-s4lgk\"" Apr 17 11:18:08.134326 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.134313 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 11:18:08.134512 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.134413 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 11:18:08.134512 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.134454 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 11:18:08.134512 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.134454 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 11:18:08.139280 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.139263 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 11:18:08.141436 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.141415 2568 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8q9jb"] Apr 17 11:18:08.192539 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.192505 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" Apr 17 11:18:08.192539 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.192538 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7n85\" (UniqueName: \"kubernetes.io/projected/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-kube-api-access-t7n85\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" Apr 17 11:18:08.192730 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.192599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" Apr 17 11:18:08.293972 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.293939 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-snapshots\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 
11:18:08.294077 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.293981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-service-ca-bundle\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.294077 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.294035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" Apr 17 11:18:08.294077 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.294055 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7n85\" (UniqueName: \"kubernetes.io/projected/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-kube-api-access-t7n85\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" Apr 17 11:18:08.294177 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.294127 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" Apr 17 11:18:08.294177 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.294158 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-tmp\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.294237 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:08.294179 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:08.294237 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.294196 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.294237 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.294219 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-serving-cert\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.294325 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:08.294250 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls podName:c3bbc74d-0a7a-4056-af5e-f1e1491bfed5 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:08.794231191 +0000 UTC m=+118.717085992 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2wsvr" (UID: "c3bbc74d-0a7a-4056-af5e-f1e1491bfed5") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:08.294369 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.294319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcfjf\" (UniqueName: \"kubernetes.io/projected/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-kube-api-access-dcfjf\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.294821 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.294801 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" Apr 17 11:18:08.302946 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.302926 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7n85\" (UniqueName: \"kubernetes.io/projected/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-kube-api-access-t7n85\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" Apr 17 11:18:08.395168 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.395135 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcfjf\" (UniqueName: \"kubernetes.io/projected/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-kube-api-access-dcfjf\") pod 
\"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.395351 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.395183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-snapshots\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.395351 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.395201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-service-ca-bundle\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.395494 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.395370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-tmp\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.395494 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.395423 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb" Apr 17 11:18:08.395494 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.395450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-serving-cert\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb"
Apr 17 11:18:08.395754 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.395734 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-service-ca-bundle\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb"
Apr 17 11:18:08.395789 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.395774 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-tmp\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb"
Apr 17 11:18:08.395872 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.395849 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-snapshots\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb"
Apr 17 11:18:08.396612 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.396596 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb"
Apr 17 11:18:08.397744 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.397726 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-serving-cert\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb"
Apr 17 11:18:08.403908 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.403882 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcfjf\" (UniqueName: \"kubernetes.io/projected/bfdb9877-c4ed-40e3-9a4c-80fe70a2f755-kube-api-access-dcfjf\") pod \"insights-operator-585dfdc468-8q9jb\" (UID: \"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755\") " pod="openshift-insights/insights-operator-585dfdc468-8q9jb"
Apr 17 11:18:08.438927 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.438892 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8q9jb"
Apr 17 11:18:08.549668 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.549566 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8q9jb"]
Apr 17 11:18:08.553703 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:08.553670 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfdb9877_c4ed_40e3_9a4c_80fe70a2f755.slice/crio-803eba8ed9ab722988c465bf181d306bab8cc75c68442af4a7012b858bc2aaf9 WatchSource:0}: Error finding container 803eba8ed9ab722988c465bf181d306bab8cc75c68442af4a7012b858bc2aaf9: Status 404 returned error can't find the container with id 803eba8ed9ab722988c465bf181d306bab8cc75c68442af4a7012b858bc2aaf9
Apr 17 11:18:08.798737 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:08.798708 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"
Apr 17 11:18:08.798870 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:08.798820 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:08.798870 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:08.798868 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls podName:c3bbc74d-0a7a-4056-af5e-f1e1491bfed5 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:09.798856191 +0000 UTC m=+119.721710986 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2wsvr" (UID: "c3bbc74d-0a7a-4056-af5e-f1e1491bfed5") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:09.056478 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:09.056450 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8q9jb" event={"ID":"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755","Type":"ContainerStarted","Data":"803eba8ed9ab722988c465bf181d306bab8cc75c68442af4a7012b858bc2aaf9"}
Apr 17 11:18:09.804923 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:09.804881 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"
Apr 17 11:18:09.805090 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:09.805049 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:09.805166 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:09.805121 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls podName:c3bbc74d-0a7a-4056-af5e-f1e1491bfed5 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:11.805105294 +0000 UTC m=+121.727960090 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2wsvr" (UID: "c3bbc74d-0a7a-4056-af5e-f1e1491bfed5") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:11.060929 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:11.060896 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8q9jb" event={"ID":"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755","Type":"ContainerStarted","Data":"0b74235cd70e20ea9064fab5c17d84647200104a014c0eb9b86cdf644db8a6b3"}
Apr 17 11:18:11.077629 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:11.077585 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-8q9jb" podStartSLOduration=1.396070526 podStartE2EDuration="3.077568903s" podCreationTimestamp="2026-04-17 11:18:08 +0000 UTC" firstStartedPulling="2026-04-17 11:18:08.55544512 +0000 UTC m=+118.478299917" lastFinishedPulling="2026-04-17 11:18:10.236943498 +0000 UTC m=+120.159798294" observedRunningTime="2026-04-17 11:18:11.07724243 +0000 UTC m=+121.000097249" watchObservedRunningTime="2026-04-17 11:18:11.077568903 +0000 UTC m=+121.000423721"
Apr 17 11:18:11.820682 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:11.820637 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"
Apr 17 11:18:11.820894 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:11.820772 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:11.820894 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:11.820856 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls podName:c3bbc74d-0a7a-4056-af5e-f1e1491bfed5 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:15.820834981 +0000 UTC m=+125.743689777 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2wsvr" (UID: "c3bbc74d-0a7a-4056-af5e-f1e1491bfed5") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:13.201696 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:13.201666 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h7hkk_61d4b955-d2fa-4cee-a5a9-5bb37d994e5f/dns-node-resolver/0.log"
Apr 17 11:18:14.202243 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:14.202216 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kgk95_0ebdddb3-e6b6-4191-9db1-01e8d15cae25/node-ca/0.log"
Apr 17 11:18:15.763124 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.763089 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"]
Apr 17 11:18:15.766091 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.766075 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:15.768693 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.768672 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 11:18:15.769761 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.769742 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 11:18:15.769817 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.769760 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:15.769817 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.769770 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hrl8t\""
Apr 17 11:18:15.773590 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.773570 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"]
Apr 17 11:18:15.852808 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.852785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"
Apr 17 11:18:15.852940 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:15.852880 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:15.852940 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:15.852937 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls podName:c3bbc74d-0a7a-4056-af5e-f1e1491bfed5 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:23.85292434 +0000 UTC m=+133.775779136 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2wsvr" (UID: "c3bbc74d-0a7a-4056-af5e-f1e1491bfed5") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:15.868512 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.868489 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85"]
Apr 17 11:18:15.871365 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.871346 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85"
Apr 17 11:18:15.874177 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.874156 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 11:18:15.874256 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.874209 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:15.874312 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.874210 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-4vnws\""
Apr 17 11:18:15.879450 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.879429 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85"]
Apr 17 11:18:15.953696 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.953669 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg9jl\" (UniqueName: \"kubernetes.io/projected/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-kube-api-access-fg9jl\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:15.953803 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:15.953700 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:16.054451 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.054427 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:16.054572 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.054557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg9jl\" (UniqueName: \"kubernetes.io/projected/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-kube-api-access-fg9jl\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:16.054610 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:16.054578 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 11:18:16.054610 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.054593 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmv8\" (UniqueName: \"kubernetes.io/projected/c336d09a-931b-409b-a010-6bf7cb87a9a9-kube-api-access-8bmv8\") pod \"volume-data-source-validator-7c6cbb6c87-r5k85\" (UID: \"c336d09a-931b-409b-a010-6bf7cb87a9a9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85"
Apr 17 11:18:16.054675 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:16.054646 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls podName:a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:16.554628808 +0000 UTC m=+126.477483615 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rrkhp" (UID: "a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f") : secret "samples-operator-tls" not found
Apr 17 11:18:16.063534 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.063511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg9jl\" (UniqueName: \"kubernetes.io/projected/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-kube-api-access-fg9jl\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:16.155911 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.155861 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bmv8\" (UniqueName: \"kubernetes.io/projected/c336d09a-931b-409b-a010-6bf7cb87a9a9-kube-api-access-8bmv8\") pod \"volume-data-source-validator-7c6cbb6c87-r5k85\" (UID: \"c336d09a-931b-409b-a010-6bf7cb87a9a9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85"
Apr 17 11:18:16.164240 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.164212 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bmv8\" (UniqueName: \"kubernetes.io/projected/c336d09a-931b-409b-a010-6bf7cb87a9a9-kube-api-access-8bmv8\") pod \"volume-data-source-validator-7c6cbb6c87-r5k85\" (UID: \"c336d09a-931b-409b-a010-6bf7cb87a9a9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85"
Apr 17 11:18:16.181995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.181970 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85"
Apr 17 11:18:16.289491 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.289433 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85"]
Apr 17 11:18:16.293799 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:16.293770 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc336d09a_931b_409b_a010_6bf7cb87a9a9.slice/crio-1d01755ff9374c5169182fd08d57a2bb7d71e66a25a01fef969cf946a46dfb2a WatchSource:0}: Error finding container 1d01755ff9374c5169182fd08d57a2bb7d71e66a25a01fef969cf946a46dfb2a: Status 404 returned error can't find the container with id 1d01755ff9374c5169182fd08d57a2bb7d71e66a25a01fef969cf946a46dfb2a
Apr 17 11:18:16.559182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.559149 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:16.559368 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:16.559283 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 11:18:16.559368 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:16.559342 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls podName:a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:17.55932544 +0000 UTC m=+127.482180241 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rrkhp" (UID: "a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f") : secret "samples-operator-tls" not found
Apr 17 11:18:16.704871 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.704837 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-22ddd"]
Apr 17 11:18:16.709890 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.709875 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.712715 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.712694 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:16.712835 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.712765 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 11:18:16.712835 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.712777 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-czd74\""
Apr 17 11:18:16.713555 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.713531 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 11:18:16.713904 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.713890 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 11:18:16.719258 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.719239 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 17 11:18:16.719873 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.719855 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-22ddd"]
Apr 17 11:18:16.861901 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.861824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/134e0312-09a3-4d5f-b641-3d6579587cde-trusted-ca\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.861901 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.861887 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/134e0312-09a3-4d5f-b641-3d6579587cde-config\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.862334 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.862025 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7ndg\" (UniqueName: \"kubernetes.io/projected/134e0312-09a3-4d5f-b641-3d6579587cde-kube-api-access-d7ndg\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.862334 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.862130 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/134e0312-09a3-4d5f-b641-3d6579587cde-serving-cert\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.963197 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.963170 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/134e0312-09a3-4d5f-b641-3d6579587cde-serving-cert\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.963336 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.963217 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/134e0312-09a3-4d5f-b641-3d6579587cde-trusted-ca\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.963336 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.963293 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/134e0312-09a3-4d5f-b641-3d6579587cde-config\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.963430 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.963351 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7ndg\" (UniqueName: \"kubernetes.io/projected/134e0312-09a3-4d5f-b641-3d6579587cde-kube-api-access-d7ndg\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.964003 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.963977 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/134e0312-09a3-4d5f-b641-3d6579587cde-config\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.964117 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.964080 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/134e0312-09a3-4d5f-b641-3d6579587cde-trusted-ca\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.965360 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.965342 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/134e0312-09a3-4d5f-b641-3d6579587cde-serving-cert\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:16.975849 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:16.975829 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7ndg\" (UniqueName: \"kubernetes.io/projected/134e0312-09a3-4d5f-b641-3d6579587cde-kube-api-access-d7ndg\") pod \"console-operator-9d4b6777b-22ddd\" (UID: \"134e0312-09a3-4d5f-b641-3d6579587cde\") " pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:17.018868 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:17.018842 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:17.074820 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:17.074749 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85" event={"ID":"c336d09a-931b-409b-a010-6bf7cb87a9a9","Type":"ContainerStarted","Data":"1d01755ff9374c5169182fd08d57a2bb7d71e66a25a01fef969cf946a46dfb2a"}
Apr 17 11:18:17.146719 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:17.146639 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-22ddd"]
Apr 17 11:18:17.149986 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:17.149958 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod134e0312_09a3_4d5f_b641_3d6579587cde.slice/crio-cd17263989ed372828541c4bc8613f0c74cab99c2208028ff4060c8594ed0756 WatchSource:0}: Error finding container cd17263989ed372828541c4bc8613f0c74cab99c2208028ff4060c8594ed0756: Status 404 returned error can't find the container with id cd17263989ed372828541c4bc8613f0c74cab99c2208028ff4060c8594ed0756
Apr 17 11:18:17.567974 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:17.567940 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:17.568110 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:17.568081 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 11:18:17.568188 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:17.568146 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls podName:a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:19.568132232 +0000 UTC m=+129.490987027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rrkhp" (UID: "a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f") : secret "samples-operator-tls" not found
Apr 17 11:18:18.077530 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:18.077486 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" event={"ID":"134e0312-09a3-4d5f-b641-3d6579587cde","Type":"ContainerStarted","Data":"cd17263989ed372828541c4bc8613f0c74cab99c2208028ff4060c8594ed0756"}
Apr 17 11:18:18.079019 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:18.078992 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85" event={"ID":"c336d09a-931b-409b-a010-6bf7cb87a9a9","Type":"ContainerStarted","Data":"e45873ae43b82e0ab10d8b1be1e9ada4f9d918ecbad8b138c03b177bcf631302"}
Apr 17 11:18:18.096418 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:18.096361 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-r5k85" podStartSLOduration=1.819656459 podStartE2EDuration="3.096346746s" podCreationTimestamp="2026-04-17 11:18:15 +0000 UTC" firstStartedPulling="2026-04-17 11:18:16.295554666 +0000 UTC m=+126.218409462" lastFinishedPulling="2026-04-17 11:18:17.572244949 +0000 UTC m=+127.495099749" observedRunningTime="2026-04-17 11:18:18.095726452 +0000 UTC m=+128.018581271" watchObservedRunningTime="2026-04-17 11:18:18.096346746 +0000 UTC m=+128.019201563"
Apr 17 11:18:19.082042 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:19.082018 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/0.log"
Apr 17 11:18:19.082340 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:19.082055 2568 generic.go:358] "Generic (PLEG): container finished" podID="134e0312-09a3-4d5f-b641-3d6579587cde" containerID="8b2bc9c6d31a54672f22728acb434fbda4f2a5bc08942894f7e2d19ddf85c4a1" exitCode=255
Apr 17 11:18:19.082340 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:19.082093 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" event={"ID":"134e0312-09a3-4d5f-b641-3d6579587cde","Type":"ContainerDied","Data":"8b2bc9c6d31a54672f22728acb434fbda4f2a5bc08942894f7e2d19ddf85c4a1"}
Apr 17 11:18:19.082340 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:19.082301 2568 scope.go:117] "RemoveContainer" containerID="8b2bc9c6d31a54672f22728acb434fbda4f2a5bc08942894f7e2d19ddf85c4a1"
Apr 17 11:18:19.582870 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:19.582829 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:19.583028 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:19.582964 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 11:18:19.583069 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:19.583028 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls podName:a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:23.583014043 +0000 UTC m=+133.505868843 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rrkhp" (UID: "a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f") : secret "samples-operator-tls" not found
Apr 17 11:18:20.086175 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:20.086150 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/1.log"
Apr 17 11:18:20.086549 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:20.086524 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/0.log"
Apr 17 11:18:20.086607 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:20.086556 2568 generic.go:358] "Generic (PLEG): container finished" podID="134e0312-09a3-4d5f-b641-3d6579587cde" containerID="e0de1a1446fd101c1be3521e9278ee2da1064d615864fec1e15d2a82e5c3e099" exitCode=255
Apr 17 11:18:20.086607 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:20.086593 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" event={"ID":"134e0312-09a3-4d5f-b641-3d6579587cde","Type":"ContainerDied","Data":"e0de1a1446fd101c1be3521e9278ee2da1064d615864fec1e15d2a82e5c3e099"}
Apr 17 11:18:20.086702 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:20.086646 2568 scope.go:117] "RemoveContainer" containerID="8b2bc9c6d31a54672f22728acb434fbda4f2a5bc08942894f7e2d19ddf85c4a1"
Apr 17 11:18:20.086823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:20.086807 2568 scope.go:117] "RemoveContainer" containerID="e0de1a1446fd101c1be3521e9278ee2da1064d615864fec1e15d2a82e5c3e099"
Apr 17 11:18:20.087005 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:20.086988 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-22ddd_openshift-console-operator(134e0312-09a3-4d5f-b641-3d6579587cde)\"" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" podUID="134e0312-09a3-4d5f-b641-3d6579587cde"
Apr 17 11:18:20.387755 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:20.387672 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw"
Apr 17 11:18:20.387893 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:20.387813 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:18:20.387893 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:20.387882 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs podName:d84dc363-0ebb-4e0c-9b94-1024f80ccbb3 nodeName:}" failed. No retries permitted until 2026-04-17 11:20:22.387864248 +0000 UTC m=+252.310719058 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs") pod "network-metrics-daemon-4d5rw" (UID: "d84dc363-0ebb-4e0c-9b94-1024f80ccbb3") : secret "metrics-daemon-secret" not found Apr 17 11:18:21.093563 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:21.093533 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/1.log" Apr 17 11:18:21.093924 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:21.093839 2568 scope.go:117] "RemoveContainer" containerID="e0de1a1446fd101c1be3521e9278ee2da1064d615864fec1e15d2a82e5c3e099" Apr 17 11:18:21.093999 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:21.093982 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-22ddd_openshift-console-operator(134e0312-09a3-4d5f-b641-3d6579587cde)\"" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" podUID="134e0312-09a3-4d5f-b641-3d6579587cde" Apr 17 11:18:23.611342 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:23.611298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp" Apr 17 11:18:23.611725 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:23.611467 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:18:23.611725 ip-10-0-142-114 kubenswrapper[2568]: E0417 
11:18:23.611530 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls podName:a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:31.611513404 +0000 UTC m=+141.534368205 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rrkhp" (UID: "a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f") : secret "samples-operator-tls" not found
Apr 17 11:18:23.913895 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:23.913813 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"
Apr 17 11:18:23.914017 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:23.913945 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:23.914017 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:23.914011 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls podName:c3bbc74d-0a7a-4056-af5e-f1e1491bfed5 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:39.913994888 +0000 UTC m=+149.836849690 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2wsvr" (UID: "c3bbc74d-0a7a-4056-af5e-f1e1491bfed5") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:24.018879 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.018850 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jtzjd"]
Apr 17 11:18:24.022847 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.022834 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.025336 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.025318 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 11:18:24.025446 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.025318 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 11:18:24.025499 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.025458 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 11:18:24.026602 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.026586 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 11:18:24.026686 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.026618 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bsmxk\""
Apr 17 11:18:24.029253 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.029009 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jtzjd"]
Apr 17 11:18:24.115615 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.115574 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6xf\" (UniqueName: \"kubernetes.io/projected/13377be9-2206-4fad-811d-9b6d348c8317-kube-api-access-9r6xf\") pod \"service-ca-865cb79987-jtzjd\" (UID: \"13377be9-2206-4fad-811d-9b6d348c8317\") " pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.115744 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.115632 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13377be9-2206-4fad-811d-9b6d348c8317-signing-key\") pod \"service-ca-865cb79987-jtzjd\" (UID: \"13377be9-2206-4fad-811d-9b6d348c8317\") " pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.115744 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.115707 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13377be9-2206-4fad-811d-9b6d348c8317-signing-cabundle\") pod \"service-ca-865cb79987-jtzjd\" (UID: \"13377be9-2206-4fad-811d-9b6d348c8317\") " pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.216625 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.216560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13377be9-2206-4fad-811d-9b6d348c8317-signing-cabundle\") pod \"service-ca-865cb79987-jtzjd\" (UID: \"13377be9-2206-4fad-811d-9b6d348c8317\") " pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.216720 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.216706 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6xf\" (UniqueName: \"kubernetes.io/projected/13377be9-2206-4fad-811d-9b6d348c8317-kube-api-access-9r6xf\") pod \"service-ca-865cb79987-jtzjd\" (UID: \"13377be9-2206-4fad-811d-9b6d348c8317\") " pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.216756 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.216739 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13377be9-2206-4fad-811d-9b6d348c8317-signing-key\") pod \"service-ca-865cb79987-jtzjd\" (UID: \"13377be9-2206-4fad-811d-9b6d348c8317\") " pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.217330 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.217303 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13377be9-2206-4fad-811d-9b6d348c8317-signing-cabundle\") pod \"service-ca-865cb79987-jtzjd\" (UID: \"13377be9-2206-4fad-811d-9b6d348c8317\") " pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.219043 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.219021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13377be9-2206-4fad-811d-9b6d348c8317-signing-key\") pod \"service-ca-865cb79987-jtzjd\" (UID: \"13377be9-2206-4fad-811d-9b6d348c8317\") " pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.225041 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.225022 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6xf\" (UniqueName: \"kubernetes.io/projected/13377be9-2206-4fad-811d-9b6d348c8317-kube-api-access-9r6xf\") pod \"service-ca-865cb79987-jtzjd\" (UID: \"13377be9-2206-4fad-811d-9b6d348c8317\") " pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.332168 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.332138 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jtzjd"
Apr 17 11:18:24.441931 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:24.441899 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jtzjd"]
Apr 17 11:18:24.444821 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:24.444793 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13377be9_2206_4fad_811d_9b6d348c8317.slice/crio-e7ec288217826f4efb4af0101e810024e1beefd157acc09c344b6f3bd08c9439 WatchSource:0}: Error finding container e7ec288217826f4efb4af0101e810024e1beefd157acc09c344b6f3bd08c9439: Status 404 returned error can't find the container with id e7ec288217826f4efb4af0101e810024e1beefd157acc09c344b6f3bd08c9439
Apr 17 11:18:25.102733 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:25.102691 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jtzjd" event={"ID":"13377be9-2206-4fad-811d-9b6d348c8317","Type":"ContainerStarted","Data":"e7ec288217826f4efb4af0101e810024e1beefd157acc09c344b6f3bd08c9439"}
Apr 17 11:18:27.019647 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:27.019612 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:27.019647 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:27.019647 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:18:27.020008 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:27.019964 2568 scope.go:117] "RemoveContainer" containerID="e0de1a1446fd101c1be3521e9278ee2da1064d615864fec1e15d2a82e5c3e099"
Apr 17 11:18:27.020130 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:27.020114 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to
\"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-22ddd_openshift-console-operator(134e0312-09a3-4d5f-b641-3d6579587cde)\"" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" podUID="134e0312-09a3-4d5f-b641-3d6579587cde"
Apr 17 11:18:28.109814 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:28.109776 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jtzjd" event={"ID":"13377be9-2206-4fad-811d-9b6d348c8317","Type":"ContainerStarted","Data":"22051fe1db05523cd9e4ee05cb83852bbf12da266ec6bc23f643e51c7396a8b3"}
Apr 17 11:18:28.132059 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:28.132013 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-jtzjd" podStartSLOduration=0.830960013 podStartE2EDuration="4.13199857s" podCreationTimestamp="2026-04-17 11:18:24 +0000 UTC" firstStartedPulling="2026-04-17 11:18:24.446491938 +0000 UTC m=+134.369346733" lastFinishedPulling="2026-04-17 11:18:27.747530491 +0000 UTC m=+137.670385290" observedRunningTime="2026-04-17 11:18:28.130007776 +0000 UTC m=+138.052862595" watchObservedRunningTime="2026-04-17 11:18:28.13199857 +0000 UTC m=+138.054853389"
Apr 17 11:18:31.676061 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:31.676024 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:31.678434 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:31.678411 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rrkhp\" (UID: \"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:31.974492 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:31.974406 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"
Apr 17 11:18:32.086122 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:32.086090 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp"]
Apr 17 11:18:33.124401 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:33.124350 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp" event={"ID":"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f","Type":"ContainerStarted","Data":"68074f9e70cee43f12b038614c483341736a859c746c9d7c9e320f6c5ddfe7da"}
Apr 17 11:18:34.127924 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:34.127890 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp" event={"ID":"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f","Type":"ContainerStarted","Data":"4dc759ae67c93cd14f75c748bf4636adea926608387479cceb1f8d04d0ffd9ee"}
Apr 17 11:18:34.127924 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:34.127928 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp" event={"ID":"a3a0bce0-8841-4fcf-9b5d-36b6ebf0250f","Type":"ContainerStarted","Data":"4415289b370e5a9ede14d35cb750c3d8bde79aabfc2e8710fae120bffab5ac9b"}
Apr 17 11:18:34.145094 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:34.145055 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rrkhp" podStartSLOduration=17.566458668 podStartE2EDuration="19.145041183s" podCreationTimestamp="2026-04-17 11:18:15 +0000 UTC" firstStartedPulling="2026-04-17 11:18:32.1367777 +0000 UTC m=+142.059632495" lastFinishedPulling="2026-04-17 11:18:33.715360214 +0000 UTC m=+143.638215010" observedRunningTime="2026-04-17 11:18:34.144221411 +0000 UTC m=+144.067076229" watchObservedRunningTime="2026-04-17 11:18:34.145041183 +0000 UTC m=+144.067896001"
Apr 17 11:18:39.943154 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:39.943102 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"
Apr 17 11:18:39.945399 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:39.945366 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3bbc74d-0a7a-4056-af5e-f1e1491bfed5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2wsvr\" (UID: \"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"
Apr 17 11:18:40.136969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:40.136942 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-v296b\""
Apr 17 11:18:40.144770 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:40.144747 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"
Apr 17 11:18:40.253726 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:40.253693 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr"]
Apr 17 11:18:40.717442 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:40.717415 2568 scope.go:117] "RemoveContainer" containerID="e0de1a1446fd101c1be3521e9278ee2da1064d615864fec1e15d2a82e5c3e099"
Apr 17 11:18:41.145936 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:41.145892 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" event={"ID":"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5","Type":"ContainerStarted","Data":"0d12272ce61100aebbeb105079377db77f782c5b4ea91c6f9b13c8555d5e13b3"}
Apr 17 11:18:41.147331 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:41.147304 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/2.log"
Apr 17 11:18:41.147754 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:41.147730 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/1.log"
Apr 17 11:18:41.147864 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:41.147773 2568 generic.go:358] "Generic (PLEG): container finished" podID="134e0312-09a3-4d5f-b641-3d6579587cde" containerID="e6734be68571b1e8dfc55691345eff37d962204b183ae685ea70194ff8b1fc2d" exitCode=255
Apr 17 11:18:41.147864 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:41.147810 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" event={"ID":"134e0312-09a3-4d5f-b641-3d6579587cde","Type":"ContainerDied","Data":"e6734be68571b1e8dfc55691345eff37d962204b183ae685ea70194ff8b1fc2d"}
Apr 17 11:18:41.147864 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:41.147836 2568 scope.go:117] "RemoveContainer" containerID="e0de1a1446fd101c1be3521e9278ee2da1064d615864fec1e15d2a82e5c3e099"
Apr 17 11:18:41.148201 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:41.148183 2568 scope.go:117] "RemoveContainer" containerID="e6734be68571b1e8dfc55691345eff37d962204b183ae685ea70194ff8b1fc2d"
Apr 17 11:18:41.148448 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:41.148424 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-22ddd_openshift-console-operator(134e0312-09a3-4d5f-b641-3d6579587cde)\"" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" podUID="134e0312-09a3-4d5f-b641-3d6579587cde"
Apr 17 11:18:42.151798 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:42.151764 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" event={"ID":"c3bbc74d-0a7a-4056-af5e-f1e1491bfed5","Type":"ContainerStarted","Data":"779d28be66bfb48255ca48d5136c48be3deae2062cf940d08301f546f7cd340c"}
Apr 17 11:18:42.153064 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:42.153046 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/2.log"
Apr 17 11:18:42.168831 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:42.168785 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2wsvr" podStartSLOduration=32.668121701 podStartE2EDuration="34.168771039s" podCreationTimestamp="2026-04-17
11:18:08 +0000 UTC" firstStartedPulling="2026-04-17 11:18:40.260588393 +0000 UTC m=+150.183443194" lastFinishedPulling="2026-04-17 11:18:41.761237733 +0000 UTC m=+151.684092532" observedRunningTime="2026-04-17 11:18:42.16779576 +0000 UTC m=+152.090650575" watchObservedRunningTime="2026-04-17 11:18:42.168771039 +0000 UTC m=+152.091625926"
Apr 17 11:18:43.552576 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.552543 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl"]
Apr 17 11:18:43.555534 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.555514 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hx7b7"]
Apr 17 11:18:43.555678 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.555661 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl"
Apr 17 11:18:43.558271 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.558248 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 11:18:43.558404 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.558251 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 11:18:43.558588 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.558575 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.559329 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.559308 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-rhgln\""
Apr 17 11:18:43.560749 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.560728 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 11:18:43.561049 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.561019 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-64q7x\""
Apr 17 11:18:43.561136 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.561062 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 11:18:43.564924 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.564903 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl"]
Apr 17 11:18:43.567879 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.567860 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hx7b7"]
Apr 17 11:18:43.664204 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.664179 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d"]
Apr 17 11:18:43.667189 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.667174 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d"
Apr 17 11:18:43.669877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.669858 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-gk7tr\""
Apr 17 11:18:43.670000 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.669923 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 11:18:43.671783 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.671755 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/50ebaa9d-a374-4732-b57c-5cfb2b64a318-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-gv5jl\" (UID: \"50ebaa9d-a374-4732-b57c-5cfb2b64a318\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl"
Apr 17 11:18:43.671878 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.671791 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.671878 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.671868 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/50ebaa9d-a374-4732-b57c-5cfb2b64a318-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gv5jl\" (UID: \"50ebaa9d-a374-4732-b57c-5cfb2b64a318\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl"
Apr 17 11:18:43.671955 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.671908 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnbnw\" (UniqueName: \"kubernetes.io/projected/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-kube-api-access-bnbnw\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.671955 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.671938 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.672035 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.672019 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-data-volume\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.672139 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.672122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-crio-socket\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.682283 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.682261 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d"]
Apr 17 11:18:43.772869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.772843 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-crio-socket\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.772966 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.772894 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/50ebaa9d-a374-4732-b57c-5cfb2b64a318-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-gv5jl\" (UID: \"50ebaa9d-a374-4732-b57c-5cfb2b64a318\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl"
Apr 17 11:18:43.772966 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.772922 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.773060 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.772965 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-crio-socket\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.773060 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.772964 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/50ebaa9d-a374-4732-b57c-5cfb2b64a318-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gv5jl\" (UID: \"50ebaa9d-a374-4732-b57c-5cfb2b64a318\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl"
Apr 17 11:18:43.773060 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.773013 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnbnw\" (UniqueName: \"kubernetes.io/projected/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-kube-api-access-bnbnw\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.773060 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.773040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7"
Apr 17 11:18:43.773239 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.773066 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c617e6a0-6c4b-443d-baf3-c104f0de1db9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qc28d\" (UID: \"c617e6a0-6c4b-443d-baf3-c104f0de1db9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d"
Apr 17 11:18:43.773239 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.773107 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-data-volume\") pod \"insights-runtime-extractor-hx7b7\" (UID:
\"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7" Apr 17 11:18:43.773482 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.773458 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-data-volume\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7" Apr 17 11:18:43.773705 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.773685 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7" Apr 17 11:18:43.773784 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.773769 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/50ebaa9d-a374-4732-b57c-5cfb2b64a318-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-gv5jl\" (UID: \"50ebaa9d-a374-4732-b57c-5cfb2b64a318\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl" Apr 17 11:18:43.775244 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.775227 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7" Apr 17 11:18:43.775432 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.775416 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/50ebaa9d-a374-4732-b57c-5cfb2b64a318-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gv5jl\" (UID: \"50ebaa9d-a374-4732-b57c-5cfb2b64a318\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl" Apr 17 11:18:43.795011 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.794984 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnbnw\" (UniqueName: \"kubernetes.io/projected/46e1bfc0-432d-4c42-9d39-5eba5a87ea6a-kube-api-access-bnbnw\") pod \"insights-runtime-extractor-hx7b7\" (UID: \"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a\") " pod="openshift-insights/insights-runtime-extractor-hx7b7" Apr 17 11:18:43.865965 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.865900 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl" Apr 17 11:18:43.871556 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.871528 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hx7b7" Apr 17 11:18:43.874262 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.874243 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c617e6a0-6c4b-443d-baf3-c104f0de1db9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qc28d\" (UID: \"c617e6a0-6c4b-443d-baf3-c104f0de1db9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d" Apr 17 11:18:43.876650 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.876614 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c617e6a0-6c4b-443d-baf3-c104f0de1db9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qc28d\" (UID: \"c617e6a0-6c4b-443d-baf3-c104f0de1db9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d" Apr 17 11:18:43.975512 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:43.975305 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d" Apr 17 11:18:44.011919 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:44.011838 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl"] Apr 17 11:18:44.015031 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:44.014990 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50ebaa9d_a374_4732_b57c_5cfb2b64a318.slice/crio-743c5521abf4c3a3ede4d868b9fc7f79a4a4b8fcdd0723f67575b0c5b0a412f0 WatchSource:0}: Error finding container 743c5521abf4c3a3ede4d868b9fc7f79a4a4b8fcdd0723f67575b0c5b0a412f0: Status 404 returned error can't find the container with id 743c5521abf4c3a3ede4d868b9fc7f79a4a4b8fcdd0723f67575b0c5b0a412f0 Apr 17 11:18:44.032528 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:44.032482 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hx7b7"] Apr 17 11:18:44.037900 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:44.037871 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46e1bfc0_432d_4c42_9d39_5eba5a87ea6a.slice/crio-13f96b1f3d1e57b2770494c97c0bbe86d665d6ddc45c3a8db34399670103791e WatchSource:0}: Error finding container 13f96b1f3d1e57b2770494c97c0bbe86d665d6ddc45c3a8db34399670103791e: Status 404 returned error can't find the container with id 13f96b1f3d1e57b2770494c97c0bbe86d665d6ddc45c3a8db34399670103791e Apr 17 11:18:44.100555 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:44.100514 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d"] Apr 17 11:18:44.103194 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:44.103169 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc617e6a0_6c4b_443d_baf3_c104f0de1db9.slice/crio-70d1daeed659284b8557fddea926158f4f70d24384e829f3462c6c2b0462d4a5 WatchSource:0}: Error finding container 70d1daeed659284b8557fddea926158f4f70d24384e829f3462c6c2b0462d4a5: Status 404 returned error can't find the container with id 70d1daeed659284b8557fddea926158f4f70d24384e829f3462c6c2b0462d4a5 Apr 17 11:18:44.158859 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:44.158787 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d" event={"ID":"c617e6a0-6c4b-443d-baf3-c104f0de1db9","Type":"ContainerStarted","Data":"70d1daeed659284b8557fddea926158f4f70d24384e829f3462c6c2b0462d4a5"} Apr 17 11:18:44.160045 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:44.160018 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hx7b7" event={"ID":"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a","Type":"ContainerStarted","Data":"78419b85d18082dbd6c4b4f79584a2a316f80700c0c052ead76f66e320714669"} Apr 17 11:18:44.160150 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:44.160053 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hx7b7" event={"ID":"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a","Type":"ContainerStarted","Data":"13f96b1f3d1e57b2770494c97c0bbe86d665d6ddc45c3a8db34399670103791e"} Apr 17 11:18:44.160940 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:44.160922 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl" event={"ID":"50ebaa9d-a374-4732-b57c-5cfb2b64a318","Type":"ContainerStarted","Data":"743c5521abf4c3a3ede4d868b9fc7f79a4a4b8fcdd0723f67575b0c5b0a412f0"} Apr 17 11:18:45.165835 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:45.165795 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-hx7b7" event={"ID":"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a","Type":"ContainerStarted","Data":"c510fd44363942116468a9e0d43e2dd11dcd49b91c5f763e3d8805ec892076a8"} Apr 17 11:18:45.979822 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:45.979775 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" podUID="16f785fc-52f1-4f27-a4e9-f56d09ae67b2" Apr 17 11:18:45.994719 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:45.994681 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hst8c" podUID="9443878d-c2b0-4771-b41e-f23e0fff86a4" Apr 17 11:18:46.027564 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:46.027526 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8jgxz" podUID="015be0d7-ff4e-4b65-b3ee-73d579ba395e" Apr 17 11:18:46.169737 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:46.169709 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d" event={"ID":"c617e6a0-6c4b-443d-baf3-c104f0de1db9","Type":"ContainerStarted","Data":"1ca63f3be16d3e1fa721cc74b1ac34bcbf4fdf96bf1bf5e97478de6194617174"} Apr 17 11:18:46.170078 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:46.169936 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d" Apr 17 11:18:46.171177 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:46.171155 2568 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-dns/dns-default-hst8c" Apr 17 11:18:46.171177 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:46.171171 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl" event={"ID":"50ebaa9d-a374-4732-b57c-5cfb2b64a318","Type":"ContainerStarted","Data":"0c014300cb2fa49b7dd4b7027d01605018f57b673facf13c33ba520fcb8eb139"} Apr 17 11:18:46.175090 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:46.175053 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d" Apr 17 11:18:46.204551 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:46.204513 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qc28d" podStartSLOduration=1.935336338 podStartE2EDuration="3.204502185s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:44.10497206 +0000 UTC m=+154.027826856" lastFinishedPulling="2026-04-17 11:18:45.374137892 +0000 UTC m=+155.296992703" observedRunningTime="2026-04-17 11:18:46.185165 +0000 UTC m=+156.108019818" watchObservedRunningTime="2026-04-17 11:18:46.204502185 +0000 UTC m=+156.127356997" Apr 17 11:18:46.205036 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:46.205012 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gv5jl" podStartSLOduration=1.847132608 podStartE2EDuration="3.205006745s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:44.016969105 +0000 UTC m=+153.939823900" lastFinishedPulling="2026-04-17 11:18:45.374843239 +0000 UTC m=+155.297698037" observedRunningTime="2026-04-17 11:18:46.203585006 +0000 UTC m=+156.126439824" watchObservedRunningTime="2026-04-17 11:18:46.205006745 +0000 UTC 
m=+156.127861563" Apr 17 11:18:47.019195 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.019164 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" Apr 17 11:18:47.019195 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.019196 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" Apr 17 11:18:47.019502 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.019490 2568 scope.go:117] "RemoveContainer" containerID="e6734be68571b1e8dfc55691345eff37d962204b183ae685ea70194ff8b1fc2d" Apr 17 11:18:47.019659 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:47.019643 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-22ddd_openshift-console-operator(134e0312-09a3-4d5f-b641-3d6579587cde)\"" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" podUID="134e0312-09a3-4d5f-b641-3d6579587cde" Apr 17 11:18:47.175494 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.175455 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hx7b7" event={"ID":"46e1bfc0-432d-4c42-9d39-5eba5a87ea6a","Type":"ContainerStarted","Data":"fd259b84f56b9dfb52bd04855c3c091ac923b19556dddbd9e88550852da63853"} Apr 17 11:18:47.198297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.198253 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hx7b7" podStartSLOduration=2.150583182 podStartE2EDuration="4.198240118s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:44.098966684 +0000 UTC m=+154.021821480" lastFinishedPulling="2026-04-17 11:18:46.146623617 +0000 UTC 
m=+156.069478416" observedRunningTime="2026-04-17 11:18:47.196583784 +0000 UTC m=+157.119438621" watchObservedRunningTime="2026-04-17 11:18:47.198240118 +0000 UTC m=+157.121094933" Apr 17 11:18:47.295160 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.295130 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-r4hbg"] Apr 17 11:18:47.298253 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.298236 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.300775 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.300753 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 11:18:47.300869 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.300798 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 11:18:47.300939 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.300922 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-58mbc\"" Apr 17 11:18:47.301048 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.301033 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 11:18:47.306573 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.306551 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-r4hbg"] Apr 17 11:18:47.399509 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.399477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.399638 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.399522 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.399638 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.399554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhd5p\" (UniqueName: \"kubernetes.io/projected/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-kube-api-access-dhd5p\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.399638 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.399608 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.500080 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.500049 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: 
\"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.500224 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.500087 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhd5p\" (UniqueName: \"kubernetes.io/projected/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-kube-api-access-dhd5p\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.500224 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.500168 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.500332 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.500237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.500332 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:47.500320 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 11:18:47.500451 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:47.500379 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-prometheus-operator-tls podName:ec5a98f4-eb7f-4316-8786-d7bfaf42593e nodeName:}" failed. 
No retries permitted until 2026-04-17 11:18:48.000363589 +0000 UTC m=+157.923218385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-r4hbg" (UID: "ec5a98f4-eb7f-4316-8786-d7bfaf42593e") : secret "prometheus-operator-tls" not found Apr 17 11:18:47.500805 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.500785 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.502558 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.502542 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.509234 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:47.509203 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhd5p\" (UniqueName: \"kubernetes.io/projected/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-kube-api-access-dhd5p\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:47.734923 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:47.734839 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: 
context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4d5rw" podUID="d84dc363-0ebb-4e0c-9b94-1024f80ccbb3" Apr 17 11:18:48.004307 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:48.004232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:48.006644 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:48.006613 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec5a98f4-eb7f-4316-8786-d7bfaf42593e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-r4hbg\" (UID: \"ec5a98f4-eb7f-4316-8786-d7bfaf42593e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:48.207982 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:48.207953 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" Apr 17 11:18:48.321182 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:48.321070 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-r4hbg"] Apr 17 11:18:48.323554 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:48.323529 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec5a98f4_eb7f_4316_8786_d7bfaf42593e.slice/crio-913d7c85829281ddbc1849f981e03c6d9934c1b803335c721bbb91fd4afdf1a6 WatchSource:0}: Error finding container 913d7c85829281ddbc1849f981e03c6d9934c1b803335c721bbb91fd4afdf1a6: Status 404 returned error can't find the container with id 913d7c85829281ddbc1849f981e03c6d9934c1b803335c721bbb91fd4afdf1a6 Apr 17 11:18:49.186693 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:49.186651 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" event={"ID":"ec5a98f4-eb7f-4316-8786-d7bfaf42593e","Type":"ContainerStarted","Data":"913d7c85829281ddbc1849f981e03c6d9934c1b803335c721bbb91fd4afdf1a6"} Apr 17 11:18:50.190544 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.190510 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" event={"ID":"ec5a98f4-eb7f-4316-8786-d7bfaf42593e","Type":"ContainerStarted","Data":"9f4695ae05b9508502a408c5a4f3909229e9a1f4d63eca8fbfe1e4d4031f7fbc"} Apr 17 11:18:50.190544 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.190547 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" event={"ID":"ec5a98f4-eb7f-4316-8786-d7bfaf42593e","Type":"ContainerStarted","Data":"18753f202c21a6908fc94387d5f9faac2488f758022ef34af59cdd1cb8eb7ca3"} Apr 17 11:18:50.225450 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.225405 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-r4hbg" podStartSLOduration=1.981357326 podStartE2EDuration="3.225376181s" podCreationTimestamp="2026-04-17 11:18:47 +0000 UTC" firstStartedPulling="2026-04-17 11:18:48.325270159 +0000 UTC m=+158.248124955" lastFinishedPulling="2026-04-17 11:18:49.569289014 +0000 UTC m=+159.492143810" observedRunningTime="2026-04-17 11:18:50.223965916 +0000 UTC m=+160.146820734" watchObservedRunningTime="2026-04-17 11:18:50.225376181 +0000 UTC m=+160.148230998"
Apr 17 11:18:50.828353 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.828318 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb"
Apr 17 11:18:50.828353 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.828359 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c"
Apr 17 11:18:50.830596 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.830572 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9443878d-c2b0-4771-b41e-f23e0fff86a4-metrics-tls\") pod \"dns-default-hst8c\" (UID: \"9443878d-c2b0-4771-b41e-f23e0fff86a4\") " pod="openshift-dns/dns-default-hst8c"
Apr 17 11:18:50.830697 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.830672 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"image-registry-799f5b7986-bv6sb\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") " pod="openshift-image-registry/image-registry-799f5b7986-bv6sb"
Apr 17 11:18:50.928684 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.928660 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz"
Apr 17 11:18:50.930862 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.930840 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/015be0d7-ff4e-4b65-b3ee-73d579ba395e-cert\") pod \"ingress-canary-8jgxz\" (UID: \"015be0d7-ff4e-4b65-b3ee-73d579ba395e\") " pod="openshift-ingress-canary/ingress-canary-8jgxz"
Apr 17 11:18:50.974730 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.974702 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bd2vh\""
Apr 17 11:18:50.982984 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:50.982965 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hst8c"
Apr 17 11:18:51.093940 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.093873 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hst8c"]
Apr 17 11:18:51.096724 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:51.096691 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9443878d_c2b0_4771_b41e_f23e0fff86a4.slice/crio-69dd5ef8ed8782feca9e564794f7c0085aafe69fe3bb64fed8efcad4c48a048b WatchSource:0}: Error finding container 69dd5ef8ed8782feca9e564794f7c0085aafe69fe3bb64fed8efcad4c48a048b: Status 404 returned error can't find the container with id 69dd5ef8ed8782feca9e564794f7c0085aafe69fe3bb64fed8efcad4c48a048b
Apr 17 11:18:51.194422 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.194368 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hst8c" event={"ID":"9443878d-c2b0-4771-b41e-f23e0fff86a4","Type":"ContainerStarted","Data":"69dd5ef8ed8782feca9e564794f7c0085aafe69fe3bb64fed8efcad4c48a048b"}
Apr 17 11:18:51.644675 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.644632 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"]
Apr 17 11:18:51.648478 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.648451 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.651444 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.651282 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 11:18:51.651444 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.651328 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 11:18:51.651741 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.651706 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-n8qx9\""
Apr 17 11:18:51.656767 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.656630 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"]
Apr 17 11:18:51.661913 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.661893 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"]
Apr 17 11:18:51.665589 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.665570 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.668454 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.668434 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-n4xtt\""
Apr 17 11:18:51.668454 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.668447 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 11:18:51.668605 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.668593 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 11:18:51.668785 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.668763 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 11:18:51.680614 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.680589 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"]
Apr 17 11:18:51.691317 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.691290 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7cq4d"]
Apr 17 11:18:51.695091 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.695070 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.697964 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.697581 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 11:18:51.697964 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.697814 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2bm54\""
Apr 17 11:18:51.697964 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.697843 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 11:18:51.697964 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.697886 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 11:18:51.736340 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.736312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f9da7be-ba93-41a4-890a-451226a11e8f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.736486 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.736361 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/78e0eced-4738-41b6-84e2-8c3d5dc008d8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.736486 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.736405 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pgmz\" (UniqueName: \"kubernetes.io/projected/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-api-access-6pgmz\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.736486 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.736435 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f9da7be-ba93-41a4-890a-451226a11e8f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.736486 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.736470 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f9da7be-ba93-41a4-890a-451226a11e8f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.736762 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.736572 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.736762 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.736602 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc2ct\" (UniqueName: \"kubernetes.io/projected/0f9da7be-ba93-41a4-890a-451226a11e8f-kube-api-access-cc2ct\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.736762 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.736629 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78e0eced-4738-41b6-84e2-8c3d5dc008d8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.736762 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.736679 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.736762 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.736704 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.837913 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.837877 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.838053 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.837929 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.838053 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.837962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc2ct\" (UniqueName: \"kubernetes.io/projected/0f9da7be-ba93-41a4-890a-451226a11e8f-kube-api-access-cc2ct\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.838053 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838022 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-accelerators-collector-config\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.838207 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838079 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78e0eced-4738-41b6-84e2-8c3d5dc008d8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.838207 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838125 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-metrics-client-ca\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.838207 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838179 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.838341 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838212 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.838341 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-tls\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.838341 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f9da7be-ba93-41a4-890a-451226a11e8f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.838341 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838327 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/78e0eced-4738-41b6-84e2-8c3d5dc008d8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.838604 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838353 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pgmz\" (UniqueName: \"kubernetes.io/projected/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-api-access-6pgmz\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.838604 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-sys\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.838604 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838425 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f9da7be-ba93-41a4-890a-451226a11e8f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.838604 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838464 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjk5\" (UniqueName: \"kubernetes.io/projected/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-kube-api-access-rcjk5\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.838604 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838496 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f9da7be-ba93-41a4-890a-451226a11e8f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.838604 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838528 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-textfile\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.838604 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838560 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-root\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.838604 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838583 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-wtmp\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.838970 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838683 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.838970 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.838754 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/78e0eced-4738-41b6-84e2-8c3d5dc008d8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.838970 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:51.838942 2568 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 17 11:18:51.839115 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:51.839001 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f9da7be-ba93-41a4-890a-451226a11e8f-openshift-state-metrics-tls podName:0f9da7be-ba93-41a4-890a-451226a11e8f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:52.33898302 +0000 UTC m=+162.261837826 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/0f9da7be-ba93-41a4-890a-451226a11e8f-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-5q6cg" (UID: "0f9da7be-ba93-41a4-890a-451226a11e8f") : secret "openshift-state-metrics-tls" not found
Apr 17 11:18:51.839529 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.839505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78e0eced-4738-41b6-84e2-8c3d5dc008d8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.839969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.839796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f9da7be-ba93-41a4-890a-451226a11e8f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.841143 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.841094 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.842184 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.842159 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.842674 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.842653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f9da7be-ba93-41a4-890a-451226a11e8f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.846648 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.846624 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc2ct\" (UniqueName: \"kubernetes.io/projected/0f9da7be-ba93-41a4-890a-451226a11e8f-kube-api-access-cc2ct\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:51.851015 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.850989 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pgmz\" (UniqueName: \"kubernetes.io/projected/78e0eced-4738-41b6-84e2-8c3d5dc008d8-kube-api-access-6pgmz\") pod \"kube-state-metrics-69db897b98-nqq2j\" (UID: \"78e0eced-4738-41b6-84e2-8c3d5dc008d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:51.939486 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-sys\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939486 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939405 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcjk5\" (UniqueName: \"kubernetes.io/projected/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-kube-api-access-rcjk5\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939486 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939440 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-textfile\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939486 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939452 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-sys\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939486 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939469 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-root\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939845 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939494 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-wtmp\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939845 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939524 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-root\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939845 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939624 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-wtmp\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939845 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939689 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939845 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939724 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-accelerators-collector-config\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939845 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939778 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-metrics-client-ca\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939845 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939803 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-textfile\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.939845 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.939819 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-tls\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.940263 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:51.939959 2568 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 11:18:51.940263 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:18:51.940027 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-tls podName:2bf7c07f-cc95-4712-aa18-c07b4d35d1a3 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:52.440007957 +0000 UTC m=+162.362862755 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-tls") pod "node-exporter-7cq4d" (UID: "2bf7c07f-cc95-4712-aa18-c07b4d35d1a3") : secret "node-exporter-tls" not found
Apr 17 11:18:51.940892 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.940860 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-metrics-client-ca\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.941014 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.940946 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-accelerators-collector-config\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.943612 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.943570 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.948971 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.948939 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcjk5\" (UniqueName: \"kubernetes.io/projected/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-kube-api-access-rcjk5\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:51.978146 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:51.977761 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"
Apr 17 11:18:52.132436 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.132378 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-nqq2j"]
Apr 17 11:18:52.343638 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.343602 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f9da7be-ba93-41a4-890a-451226a11e8f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:52.346377 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.346341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f9da7be-ba93-41a4-890a-451226a11e8f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5q6cg\" (UID: \"0f9da7be-ba93-41a4-890a-451226a11e8f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:52.413779 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:52.413727 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78e0eced_4738_41b6_84e2_8c3d5dc008d8.slice/crio-510be14949e29b5f59446c15ecfb003d953a84bfa4d31927fb8c05c996043d53 WatchSource:0}: Error finding container 510be14949e29b5f59446c15ecfb003d953a84bfa4d31927fb8c05c996043d53: Status 404 returned error can't find the container with id 510be14949e29b5f59446c15ecfb003d953a84bfa4d31927fb8c05c996043d53
Apr 17 11:18:52.445061 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.445030 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-tls\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:52.447259 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.447241 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2bf7c07f-cc95-4712-aa18-c07b4d35d1a3-node-exporter-tls\") pod \"node-exporter-7cq4d\" (UID: \"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3\") " pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:52.561828 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.561807 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"
Apr 17 11:18:52.606242 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.606155 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7cq4d"
Apr 17 11:18:52.631012 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:52.630977 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bf7c07f_cc95_4712_aa18_c07b4d35d1a3.slice/crio-e9fec170a16a5f1b30b978eb380338ef1f1e12e13313c1f8462ebda1fa934dfb WatchSource:0}: Error finding container e9fec170a16a5f1b30b978eb380338ef1f1e12e13313c1f8462ebda1fa934dfb: Status 404 returned error can't find the container with id e9fec170a16a5f1b30b978eb380338ef1f1e12e13313c1f8462ebda1fa934dfb
Apr 17 11:18:52.735060 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.735019 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg"]
Apr 17 11:18:52.737556 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:52.737520 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f9da7be_ba93_41a4_890a_451226a11e8f.slice/crio-df9d5d2780d56088033b3969daab6de4ec7e8a2aded91d99781700678b656c20 WatchSource:0}: Error finding container df9d5d2780d56088033b3969daab6de4ec7e8a2aded91d99781700678b656c20: Status 404 returned error can't find the container with id df9d5d2780d56088033b3969daab6de4ec7e8a2aded91d99781700678b656c20
Apr 17 11:18:52.792836 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.792802 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:18:52.797752 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.797719 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.800588 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.800553 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 11:18:52.800701 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.800621 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 11:18:52.800773 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.800703 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 11:18:52.800903 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.800881 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 11:18:52.801329 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.801310 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 11:18:52.801443 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.801314 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 11:18:52.801443 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.801359 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 11:18:52.801443 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.801370 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 11:18:52.801673 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.801652 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 11:18:52.801960 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.801945 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dfl4n\"" Apr 17 11:18:52.815967 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.815925 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:18:52.952428 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.952584 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952442 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.952584 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952474 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-web-config\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.952584 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952530 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2rpd\" (UniqueName: 
\"kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-kube-api-access-s2rpd\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.952584 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952561 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.952849 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952593 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.952849 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-out\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.952849 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952687 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.952849 
ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952816 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-volume\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.953054 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952880 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.953054 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.952912 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.953054 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.953047 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:52.953213 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:52.953099 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.053886 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.053849 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2rpd\" (UniqueName: \"kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-kube-api-access-s2rpd\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054065 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.053903 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054065 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.053971 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054065 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.054023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-out\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.054360 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.054437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-volume\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.054470 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.054495 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.054550 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.054585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.054640 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.054669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.054877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.054701 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-web-config\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.055802 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.055746 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.055895 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.055825 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.056035 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.056011 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.058200 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.057472 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-out\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.058200 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.057678 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.058200 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.058005 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-volume\") pod 
\"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.061331 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.059153 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-web-config\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.061331 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.059562 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.061331 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.061054 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.062026 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.061983 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.064819 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.064777 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.065682 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.065394 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2rpd\" (UniqueName: \"kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-kube-api-access-s2rpd\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.065682 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.065611 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.111963 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.111923 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:18:53.203255 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.202879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hst8c" event={"ID":"9443878d-c2b0-4771-b41e-f23e0fff86a4","Type":"ContainerStarted","Data":"1d37e8de3a9347bf13b1574e2eb31cfad58a366f3fcc3a4a85b3bbebd079c699"} Apr 17 11:18:53.203255 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.202922 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hst8c" event={"ID":"9443878d-c2b0-4771-b41e-f23e0fff86a4","Type":"ContainerStarted","Data":"1d7c4a337473800cbbf64945cbf17e96434fe76a277270c0566e15382cc675e6"} Apr 17 11:18:53.203490 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.203276 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hst8c" Apr 17 11:18:53.205217 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.205186 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg" event={"ID":"0f9da7be-ba93-41a4-890a-451226a11e8f","Type":"ContainerStarted","Data":"9ac0bb28fa7d838299004980cec639dad42126067e557947ec517d3499319acb"} Apr 17 11:18:53.205340 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.205223 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg" event={"ID":"0f9da7be-ba93-41a4-890a-451226a11e8f","Type":"ContainerStarted","Data":"68c604f12ece5505c45e71c95527a7c3e17371aafbd723e45a5d1b9851c3ebdc"} Apr 17 11:18:53.205340 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.205236 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg" event={"ID":"0f9da7be-ba93-41a4-890a-451226a11e8f","Type":"ContainerStarted","Data":"df9d5d2780d56088033b3969daab6de4ec7e8a2aded91d99781700678b656c20"} Apr 17 
11:18:53.207103 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.207074 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7cq4d" event={"ID":"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3","Type":"ContainerStarted","Data":"e9fec170a16a5f1b30b978eb380338ef1f1e12e13313c1f8462ebda1fa934dfb"} Apr 17 11:18:53.209161 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.209130 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j" event={"ID":"78e0eced-4738-41b6-84e2-8c3d5dc008d8","Type":"ContainerStarted","Data":"510be14949e29b5f59446c15ecfb003d953a84bfa4d31927fb8c05c996043d53"} Apr 17 11:18:53.226172 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.225675 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hst8c" podStartSLOduration=129.860972951 podStartE2EDuration="2m11.225659095s" podCreationTimestamp="2026-04-17 11:16:42 +0000 UTC" firstStartedPulling="2026-04-17 11:18:51.098456491 +0000 UTC m=+161.021311287" lastFinishedPulling="2026-04-17 11:18:52.463142635 +0000 UTC m=+162.385997431" observedRunningTime="2026-04-17 11:18:53.224729101 +0000 UTC m=+163.147583957" watchObservedRunningTime="2026-04-17 11:18:53.225659095 +0000 UTC m=+163.148513914" Apr 17 11:18:53.274862 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:53.274828 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:18:53.443673 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:53.443592 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f50171b_dc98_42e7_b7b2_e1911230afd4.slice/crio-a1615280fd3338a5bdd16b81640483393bed07e134e6a2f77c12fc52df30f0fa WatchSource:0}: Error finding container a1615280fd3338a5bdd16b81640483393bed07e134e6a2f77c12fc52df30f0fa: Status 404 returned error can't find the container with id 
a1615280fd3338a5bdd16b81640483393bed07e134e6a2f77c12fc52df30f0fa Apr 17 11:18:54.218173 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:54.217335 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg" event={"ID":"0f9da7be-ba93-41a4-890a-451226a11e8f","Type":"ContainerStarted","Data":"c025b59eeb213bacfe2692bca683f3ea2863432f3ac5c01722acd1d18e18c4e2"} Apr 17 11:18:54.220793 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:54.220218 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7cq4d" event={"ID":"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3","Type":"ContainerStarted","Data":"0635e532f36733f4b7de2130122b37b64bfc7ecff4f4781d257607fa7430fc45"} Apr 17 11:18:54.224895 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:54.224842 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j" event={"ID":"78e0eced-4738-41b6-84e2-8c3d5dc008d8","Type":"ContainerStarted","Data":"3a4cde6e8a19fcef96675af82c7ac55430345cd6fe0ec3126eb88bbd6f089ac6"} Apr 17 11:18:54.230294 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:54.230269 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerStarted","Data":"a1615280fd3338a5bdd16b81640483393bed07e134e6a2f77c12fc52df30f0fa"} Apr 17 11:18:54.239460 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:54.239086 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5q6cg" podStartSLOduration=2.097571345 podStartE2EDuration="3.239073323s" podCreationTimestamp="2026-04-17 11:18:51 +0000 UTC" firstStartedPulling="2026-04-17 11:18:52.904704528 +0000 UTC m=+162.827559328" lastFinishedPulling="2026-04-17 11:18:54.046206506 +0000 UTC m=+163.969061306" observedRunningTime="2026-04-17 11:18:54.237827912 
+0000 UTC m=+164.160682732" watchObservedRunningTime="2026-04-17 11:18:54.239073323 +0000 UTC m=+164.161928141" Apr 17 11:18:55.233992 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:55.233958 2568 generic.go:358] "Generic (PLEG): container finished" podID="2bf7c07f-cc95-4712-aa18-c07b4d35d1a3" containerID="0635e532f36733f4b7de2130122b37b64bfc7ecff4f4781d257607fa7430fc45" exitCode=0 Apr 17 11:18:55.234443 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:55.234050 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7cq4d" event={"ID":"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3","Type":"ContainerDied","Data":"0635e532f36733f4b7de2130122b37b64bfc7ecff4f4781d257607fa7430fc45"} Apr 17 11:18:55.236003 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:55.235983 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j" event={"ID":"78e0eced-4738-41b6-84e2-8c3d5dc008d8","Type":"ContainerStarted","Data":"b7993dc00491f5ac8929474ca39fa8b6f9ecfb36c56dff12590d63922a44793c"} Apr 17 11:18:55.236112 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:55.236008 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j" event={"ID":"78e0eced-4738-41b6-84e2-8c3d5dc008d8","Type":"ContainerStarted","Data":"669a96df95b8f0279cc36f8d03cc680de02a3028180a6e2b8d37f8336dfcf52d"} Apr 17 11:18:55.237328 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:55.237303 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerID="ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d" exitCode=0 Apr 17 11:18:55.237473 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:55.237337 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerDied","Data":"ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d"} Apr 17 11:18:55.294659 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:55.294610 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-nqq2j" podStartSLOduration=2.666189518 podStartE2EDuration="4.294596265s" podCreationTimestamp="2026-04-17 11:18:51 +0000 UTC" firstStartedPulling="2026-04-17 11:18:52.415927113 +0000 UTC m=+162.338781910" lastFinishedPulling="2026-04-17 11:18:54.044333861 +0000 UTC m=+163.967188657" observedRunningTime="2026-04-17 11:18:55.293016805 +0000 UTC m=+165.215871646" watchObservedRunningTime="2026-04-17 11:18:55.294596265 +0000 UTC m=+165.217451083" Apr 17 11:18:56.094401 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.094353 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-cbb9b68bb-dbb42"] Apr 17 11:18:56.097796 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.097771 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.102788 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.102766 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 11:18:56.102975 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.102952 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 11:18:56.103198 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.103174 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 11:18:56.103271 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.103204 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-xp2rb\"" Apr 17 11:18:56.103377 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.103363 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 11:18:56.103447 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.103396 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-adb3ht6a0p1g5\"" Apr 17 11:18:56.111816 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.111791 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-cbb9b68bb-dbb42"] Apr 17 11:18:56.184623 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.184593 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/95f1ae67-46e1-4042-b30d-597a5948855c-secret-metrics-server-tls\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " 
pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.184623 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.184625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/95f1ae67-46e1-4042-b30d-597a5948855c-audit-log\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.184823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.184645 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/95f1ae67-46e1-4042-b30d-597a5948855c-metrics-server-audit-profiles\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.184823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.184663 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxzkg\" (UniqueName: \"kubernetes.io/projected/95f1ae67-46e1-4042-b30d-597a5948855c-kube-api-access-rxzkg\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.184823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.184686 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95f1ae67-46e1-4042-b30d-597a5948855c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.184823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.184745 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f1ae67-46e1-4042-b30d-597a5948855c-client-ca-bundle\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.184823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.184802 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/95f1ae67-46e1-4042-b30d-597a5948855c-secret-metrics-server-client-certs\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.245619 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.245508 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7cq4d" event={"ID":"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3","Type":"ContainerStarted","Data":"105255a3f31ebdb135bda610f43034c22db3f99aea4ec2f1b1442ea0635e2544"} Apr 17 11:18:56.245619 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.245552 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7cq4d" event={"ID":"2bf7c07f-cc95-4712-aa18-c07b4d35d1a3","Type":"ContainerStarted","Data":"534d8a365d11d4ca1a95a472f44d732dc7f3e1424ccf9c0ef8e6e1d48c2b4192"} Apr 17 11:18:56.269550 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.269483 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7cq4d" podStartSLOduration=3.8592756489999998 podStartE2EDuration="5.26946913s" podCreationTimestamp="2026-04-17 11:18:51 +0000 UTC" firstStartedPulling="2026-04-17 11:18:52.634056845 +0000 UTC m=+162.556911655" lastFinishedPulling="2026-04-17 11:18:54.044250325 +0000 UTC m=+163.967105136" 
observedRunningTime="2026-04-17 11:18:56.26859108 +0000 UTC m=+166.191445919" watchObservedRunningTime="2026-04-17 11:18:56.26946913 +0000 UTC m=+166.192323947" Apr 17 11:18:56.285431 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.285406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/95f1ae67-46e1-4042-b30d-597a5948855c-secret-metrics-server-client-certs\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.285562 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.285480 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/95f1ae67-46e1-4042-b30d-597a5948855c-secret-metrics-server-tls\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.285562 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.285500 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/95f1ae67-46e1-4042-b30d-597a5948855c-audit-log\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.285562 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.285538 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/95f1ae67-46e1-4042-b30d-597a5948855c-metrics-server-audit-profiles\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.285562 ip-10-0-142-114 kubenswrapper[2568]: I0417 
11:18:56.285557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxzkg\" (UniqueName: \"kubernetes.io/projected/95f1ae67-46e1-4042-b30d-597a5948855c-kube-api-access-rxzkg\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.285752 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.285578 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95f1ae67-46e1-4042-b30d-597a5948855c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.285752 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.285602 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f1ae67-46e1-4042-b30d-597a5948855c-client-ca-bundle\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.286088 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.286056 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/95f1ae67-46e1-4042-b30d-597a5948855c-audit-log\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.286343 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.286324 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95f1ae67-46e1-4042-b30d-597a5948855c-configmap-kubelet-serving-ca-bundle\") pod 
\"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.286679 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.286653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/95f1ae67-46e1-4042-b30d-597a5948855c-metrics-server-audit-profiles\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.288032 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.288006 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/95f1ae67-46e1-4042-b30d-597a5948855c-secret-metrics-server-client-certs\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.288118 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.288059 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f1ae67-46e1-4042-b30d-597a5948855c-client-ca-bundle\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.288205 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.288185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/95f1ae67-46e1-4042-b30d-597a5948855c-secret-metrics-server-tls\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.300299 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.300282 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxzkg\" (UniqueName: \"kubernetes.io/projected/95f1ae67-46e1-4042-b30d-597a5948855c-kube-api-access-rxzkg\") pod \"metrics-server-cbb9b68bb-dbb42\" (UID: \"95f1ae67-46e1-4042-b30d-597a5948855c\") " pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.408600 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.408517 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:18:56.786780 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:56.786759 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-cbb9b68bb-dbb42"] Apr 17 11:18:56.788950 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:56.788926 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95f1ae67_46e1_4042_b30d_597a5948855c.slice/crio-23411c5a6a525bcc4d443146595000a78a51eae617f1e1429c919ea20efc5e40 WatchSource:0}: Error finding container 23411c5a6a525bcc4d443146595000a78a51eae617f1e1429c919ea20efc5e40: Status 404 returned error can't find the container with id 23411c5a6a525bcc4d443146595000a78a51eae617f1e1429c919ea20efc5e40 Apr 17 11:18:57.249876 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.249777 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" event={"ID":"95f1ae67-46e1-4042-b30d-597a5948855c","Type":"ContainerStarted","Data":"23411c5a6a525bcc4d443146595000a78a51eae617f1e1429c919ea20efc5e40"} Apr 17 11:18:57.252910 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.252877 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerStarted","Data":"af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735"} Apr 
17 11:18:57.253013 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.252919 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerStarted","Data":"9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead"} Apr 17 11:18:57.253013 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.252937 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerStarted","Data":"8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90"} Apr 17 11:18:57.253013 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.252949 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerStarted","Data":"e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f"} Apr 17 11:18:57.253013 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.252961 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerStarted","Data":"06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7"} Apr 17 11:18:57.902362 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.902270 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:18:57.906588 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.906561 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:57.911327 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.911303 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 11:18:57.911327 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.911323 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 11:18:57.911327 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.911304 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 11:18:57.911587 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.911310 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 11:18:57.913037 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.912510 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 11:18:57.913037 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.912516 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 11:18:57.913037 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.912614 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-947cv5euin9p9\"" Apr 17 11:18:57.913037 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.912515 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 11:18:57.913037 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.912525 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 11:18:57.913816 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.913666 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 11:18:57.913816 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.913684 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 11:18:57.913816 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.913696 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rbkbf\"" Apr 17 11:18:57.913816 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.913696 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 11:18:57.914638 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.914613 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 11:18:57.923612 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:57.923587 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:18:58.004455 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.004621 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004475 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.004621 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004498 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.004621 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004539 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.004621 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004555 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.004783 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004654 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.004783 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004693 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgfk5\" (UniqueName: \"kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-kube-api-access-kgfk5\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.004783 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004726 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.004783 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004752 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.004783 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004777 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-web-config\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.005011 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004805 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.005011 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004833 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.005011 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004867 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.005011 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004903 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.005011 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004939 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.005011 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.004992 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.005287 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.005078 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config-out\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.005287 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.005129 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.106641 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106591 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.106804 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106651 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.106804 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106712 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.106804 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106741 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.106804 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.107019 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106813 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgfk5\" (UniqueName: \"kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-kube-api-access-kgfk5\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.107019 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106844 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.107019 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106872 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.107019 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-web-config\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.107019 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106928 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.107019 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106955 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:58.107019 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.106985 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" 
(UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.107359 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.107022 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.107359 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.107060 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.107359 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.107088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.107359 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.107141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config-out\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.107359 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.107168 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.107359 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.107226 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.112317 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.108199 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.112317 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.108890 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.112317 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.110155 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.112317 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.110964 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.112317 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.111372 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.112317 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.111871 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.112317 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.112163 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.112768 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.112437 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.112768 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.112748 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.113592 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.113551 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.114247 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.114221 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.114695 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.114672 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-web-config\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.115295 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.114940 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config-out\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.115295 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.115230 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.115295 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.115231 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.115295 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.115288 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.117408 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.117364 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.120272 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.120248 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgfk5\" (UniqueName: \"kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-kube-api-access-kgfk5\") pod \"prometheus-k8s-0\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.222438 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.222338 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:18:58.260599 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.260561 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerStarted","Data":"856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347"}
Apr 17 11:18:58.308548 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.308083 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.099827487 podStartE2EDuration="6.308062139s" podCreationTimestamp="2026-04-17 11:18:52 +0000 UTC" firstStartedPulling="2026-04-17 11:18:53.446120417 +0000 UTC m=+163.368975222" lastFinishedPulling="2026-04-17 11:18:57.654355062 +0000 UTC m=+167.577209874" observedRunningTime="2026-04-17 11:18:58.300956051 +0000 UTC m=+168.223810891" watchObservedRunningTime="2026-04-17 11:18:58.308062139 +0000 UTC m=+168.230916958"
Apr 17 11:18:58.407225 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:58.407199 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:18:58.410896 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:58.410557 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3fee14c_4ad3_4e74_8743_1aadbb6a94de.slice/crio-a24b523872ebaa0d39da3d32f89e53f23c9f3684287dad1e2bc117daa8f65b5b WatchSource:0}: Error finding container a24b523872ebaa0d39da3d32f89e53f23c9f3684287dad1e2bc117daa8f65b5b: Status 404 returned error can't find the container with id a24b523872ebaa0d39da3d32f89e53f23c9f3684287dad1e2bc117daa8f65b5b
Apr 17 11:18:59.264896 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:59.264802 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" event={"ID":"95f1ae67-46e1-4042-b30d-597a5948855c","Type":"ContainerStarted","Data":"ad9431cdc45f39b634b9f83ef15375ac7b90e89725585803a1b7b4a13376d818"}
Apr 17 11:18:59.266228 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:59.266203 2568 generic.go:358] "Generic (PLEG): container finished" podID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerID="003d2c6676f2abb21c8f123bace6e4ff073b7ccd2979e1442f833913bc76fa35" exitCode=0
Apr 17 11:18:59.266325 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:59.266262 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerDied","Data":"003d2c6676f2abb21c8f123bace6e4ff073b7ccd2979e1442f833913bc76fa35"}
Apr 17 11:18:59.266325 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:59.266287 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerStarted","Data":"a24b523872ebaa0d39da3d32f89e53f23c9f3684287dad1e2bc117daa8f65b5b"}
Apr 17 11:18:59.284147 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:59.284107 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" podStartSLOduration=1.762110756 podStartE2EDuration="3.284095272s" podCreationTimestamp="2026-04-17 11:18:56 +0000 UTC" firstStartedPulling="2026-04-17 11:18:56.79152151 +0000 UTC m=+166.714376318" lastFinishedPulling="2026-04-17 11:18:58.313506035 +0000 UTC m=+168.236360834" observedRunningTime="2026-04-17 11:18:59.28317241 +0000 UTC m=+169.206027282" watchObservedRunningTime="2026-04-17 11:18:59.284095272 +0000 UTC m=+169.206950090"
Apr 17 11:18:59.715790 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:59.715758 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8jgxz"
Apr 17 11:18:59.718819 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:59.718804 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j8tns\""
Apr 17 11:18:59.726683 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:59.726666 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8jgxz"
Apr 17 11:18:59.852584 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:18:59.852531 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8jgxz"]
Apr 17 11:18:59.856999 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:18:59.856965 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod015be0d7_ff4e_4b65_b3ee_73d579ba395e.slice/crio-91d5a2c73a7b62d090ff2e8a616531bad25d11fe7f378941c86efb592a393127 WatchSource:0}: Error finding container 91d5a2c73a7b62d090ff2e8a616531bad25d11fe7f378941c86efb592a393127: Status 404 returned error can't find the container with id 91d5a2c73a7b62d090ff2e8a616531bad25d11fe7f378941c86efb592a393127
Apr 17 11:19:00.271805 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:00.271767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8jgxz" event={"ID":"015be0d7-ff4e-4b65-b3ee-73d579ba395e","Type":"ContainerStarted","Data":"91d5a2c73a7b62d090ff2e8a616531bad25d11fe7f378941c86efb592a393127"}
Apr 17 11:19:00.719536 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:00.719472 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb"
Apr 17 11:19:00.722526 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:00.722501 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xz4zw\""
Apr 17 11:19:00.730765 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:00.730711 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb"
Apr 17 11:19:00.877115 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:00.877085 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-799f5b7986-bv6sb"]
Apr 17 11:19:00.881274 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:19:00.880941 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f785fc_52f1_4f27_a4e9_f56d09ae67b2.slice/crio-b751163c885b507ff9bcfd5b71083593a9277e2abfbf83395103aee046241745 WatchSource:0}: Error finding container b751163c885b507ff9bcfd5b71083593a9277e2abfbf83395103aee046241745: Status 404 returned error can't find the container with id b751163c885b507ff9bcfd5b71083593a9277e2abfbf83395103aee046241745
Apr 17 11:19:01.277119 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:01.277075 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" event={"ID":"16f785fc-52f1-4f27-a4e9-f56d09ae67b2","Type":"ContainerStarted","Data":"e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff"}
Apr 17 11:19:01.277119 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:01.277125 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" event={"ID":"16f785fc-52f1-4f27-a4e9-f56d09ae67b2","Type":"ContainerStarted","Data":"b751163c885b507ff9bcfd5b71083593a9277e2abfbf83395103aee046241745"}
Apr 17 11:19:01.277668 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:01.277267 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb"
Apr 17 11:19:01.716417 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:01.716370 2568 scope.go:117] "RemoveContainer" containerID="e6734be68571b1e8dfc55691345eff37d962204b183ae685ea70194ff8b1fc2d"
Apr 17 11:19:02.716522 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:02.716434 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw"
Apr 17 11:19:03.233060 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.233034 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hst8c"
Apr 17 11:19:03.252805 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.252730 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" podStartSLOduration=172.252712511 podStartE2EDuration="2m52.252712511s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:19:01.29722419 +0000 UTC m=+171.220079009" watchObservedRunningTime="2026-04-17 11:19:03.252712511 +0000 UTC m=+173.175567331"
Apr 17 11:19:03.287057 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.286916 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/2.log"
Apr 17 11:19:03.287057 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.287043 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" event={"ID":"134e0312-09a3-4d5f-b641-3d6579587cde","Type":"ContainerStarted","Data":"7a0377b9eb7e8553da5d68bf8cfd2cfe965fef61da7f89277dfcac81a689e1e0"}
Apr 17 11:19:03.287942 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.287864 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:19:03.290893 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.290722 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8jgxz" event={"ID":"015be0d7-ff4e-4b65-b3ee-73d579ba395e","Type":"ContainerStarted","Data":"167266889e9362752d25f77e2f6216feb60652d3667350d92e931f3a8404d850"}
Apr 17 11:19:03.296022 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.295995 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerStarted","Data":"1db3aa0b0f25d74ba7857ca82014fd77e841490f5911e8bde7c74d61f361548e"}
Apr 17 11:19:03.296123 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.296030 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerStarted","Data":"62d972efc151aa31e016650847f06049541a13e15b963a29632bfebec8668492"}
Apr 17 11:19:03.296591 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.296568 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd"
Apr 17 11:19:03.304747 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.304465 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-22ddd" podStartSLOduration=45.523490223 podStartE2EDuration="47.304451528s" podCreationTimestamp="2026-04-17 11:18:16 +0000 UTC" firstStartedPulling="2026-04-17 11:18:17.152121995 +0000 UTC m=+127.074976791" lastFinishedPulling="2026-04-17 11:18:18.933083296 +0000 UTC m=+128.855938096" observedRunningTime="2026-04-17 11:19:03.303221212 +0000 UTC m=+173.226076022" watchObservedRunningTime="2026-04-17 11:19:03.304451528 +0000 UTC m=+173.227306347"
Apr 17 11:19:03.320784 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:03.320743 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8jgxz" podStartSLOduration=138.78251151 podStartE2EDuration="2m21.320731525s" podCreationTimestamp="2026-04-17 11:16:42 +0000 UTC" firstStartedPulling="2026-04-17 11:18:59.859757066 +0000 UTC m=+169.782611861" lastFinishedPulling="2026-04-17 11:19:02.39797708 +0000 UTC m=+172.320831876" observedRunningTime="2026-04-17 11:19:03.319728484 +0000 UTC m=+173.242583302" watchObservedRunningTime="2026-04-17 11:19:03.320731525 +0000 UTC m=+173.243586343"
Apr 17 11:19:05.307577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:05.307543 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerStarted","Data":"535eb0f5fe25e113838b745235c12e79054fc5558806d5081f5a0dc292256ef7"}
Apr 17 11:19:05.307577 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:05.307582 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerStarted","Data":"307ee33c07092c7cee3ae5f4dd509470ec752be1c01e7a18e001d90c069a80fe"}
Apr 17 11:19:05.308008 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:05.307591 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerStarted","Data":"a7448496c05dc950ac76213221c63e5c025381a4ccdd572d739480ebafb7503b"}
Apr 17 11:19:05.308008 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:05.307600 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerStarted","Data":"b547c4ebbe7e0f647beabb3d1bf7d537f90e1743af5dd626d72a5c54ae5cef63"}
Apr 17 11:19:05.342792 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:05.342743 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.088343449 podStartE2EDuration="8.342729278s" podCreationTimestamp="2026-04-17 11:18:57 +0000 UTC" firstStartedPulling="2026-04-17 11:18:59.26728622 +0000 UTC m=+169.190141016" lastFinishedPulling="2026-04-17 11:19:04.521672048 +0000 UTC m=+174.444526845" observedRunningTime="2026-04-17 11:19:05.340980257 +0000 UTC m=+175.263835075" watchObservedRunningTime="2026-04-17 11:19:05.342729278 +0000 UTC m=+175.265584095"
Apr 17 11:19:05.562485 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:05.562416 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-799f5b7986-bv6sb"]
Apr 17 11:19:08.222985 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:08.222939 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:19:15.568108 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:15.568078 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb"
Apr 17 11:19:16.409458 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:16.409419 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42"
Apr 17 11:19:16.409458 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:16.409459 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42"
Apr 17 11:19:30.580982 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.580929 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" podUID="16f785fc-52f1-4f27-a4e9-f56d09ae67b2" containerName="registry" containerID="cri-o://e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff" gracePeriod=30
Apr 17 11:19:30.826656 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.826634 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb"
Apr 17 11:19:30.911179 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.911116 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") pod \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") "
Apr 17 11:19:30.911179 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.911178 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-trusted-ca\") pod \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") "
Apr 17 11:19:30.911338 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.911202 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-ca-trust-extracted\") pod \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") "
Apr 17 11:19:30.911338 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.911251 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq5lk\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-kube-api-access-zq5lk\") pod \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") "
Apr 17 11:19:30.911338 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.911274 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-image-registry-private-configuration\") pod \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") "
Apr 17 11:19:30.911338 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.911300 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-installation-pull-secrets\") pod \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") "
Apr 17 11:19:30.911575 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.911344 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-certificates\") pod \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") "
Apr 17 11:19:30.911575 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.911372 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-bound-sa-token\") pod \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\" (UID: \"16f785fc-52f1-4f27-a4e9-f56d09ae67b2\") "
Apr 17 11:19:30.911678 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.911641 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "16f785fc-52f1-4f27-a4e9-f56d09ae67b2" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:19:30.912102 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.912072 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "16f785fc-52f1-4f27-a4e9-f56d09ae67b2" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:19:30.913562 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.913516 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "16f785fc-52f1-4f27-a4e9-f56d09ae67b2" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:30.913785 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.913754 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "16f785fc-52f1-4f27-a4e9-f56d09ae67b2" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:30.914006 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.913971 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-kube-api-access-zq5lk" (OuterVolumeSpecName: "kube-api-access-zq5lk") pod "16f785fc-52f1-4f27-a4e9-f56d09ae67b2" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2"). InnerVolumeSpecName "kube-api-access-zq5lk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:30.914006 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.913983 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "16f785fc-52f1-4f27-a4e9-f56d09ae67b2" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:19:30.914169 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.914045 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "16f785fc-52f1-4f27-a4e9-f56d09ae67b2" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:19:30.919457 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:30.919433 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "16f785fc-52f1-4f27-a4e9-f56d09ae67b2" (UID: "16f785fc-52f1-4f27-a4e9-f56d09ae67b2"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:19:31.012433 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.012409 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-trusted-ca\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:19:31.012433 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.012430 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-ca-trust-extracted\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:19:31.012582 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.012440 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zq5lk\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-kube-api-access-zq5lk\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:19:31.012582 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.012451 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-image-registry-private-configuration\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:19:31.012582 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.012460 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-installation-pull-secrets\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:19:31.012582 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.012469 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-certificates\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:19:31.012582 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.012478 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-bound-sa-token\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:19:31.012582 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.012486 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16f785fc-52f1-4f27-a4e9-f56d09ae67b2-registry-tls\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:19:31.391476 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.391441 2568 generic.go:358] "Generic (PLEG): container finished" podID="16f785fc-52f1-4f27-a4e9-f56d09ae67b2" containerID="e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff" exitCode=0
Apr 17 11:19:31.391721 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.391507 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb"
Apr 17 11:19:31.391721 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.391529 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" event={"ID":"16f785fc-52f1-4f27-a4e9-f56d09ae67b2","Type":"ContainerDied","Data":"e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff"}
Apr 17 11:19:31.391721 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.391567 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-799f5b7986-bv6sb" event={"ID":"16f785fc-52f1-4f27-a4e9-f56d09ae67b2","Type":"ContainerDied","Data":"b751163c885b507ff9bcfd5b71083593a9277e2abfbf83395103aee046241745"}
Apr 17 11:19:31.391721 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.391582 2568 scope.go:117] "RemoveContainer" containerID="e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff"
Apr 17 11:19:31.400161 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.400145 2568 scope.go:117] "RemoveContainer" containerID="e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff"
Apr 17 11:19:31.400453 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:19:31.400428 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff\": container with ID starting with e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff not found: ID does not exist" containerID="e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff"
Apr 17 11:19:31.400538 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.400463 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff"} err="failed to get container status \"e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff\": rpc error: code = NotFound desc = could not find container \"e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff\": container with ID starting with e06a21a2f2b06c27c7f1393a951fb9c4352b3634be08de0b51773d12911519ff not found: ID does not exist"
Apr 17 11:19:31.413090 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.413059 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-799f5b7986-bv6sb"]
Apr 17 11:19:31.417354 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:31.417330 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-799f5b7986-bv6sb"]
Apr 17 11:19:32.719195 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:32.719162 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f785fc-52f1-4f27-a4e9-f56d09ae67b2" path="/var/lib/kubelet/pods/16f785fc-52f1-4f27-a4e9-f56d09ae67b2/volumes"
Apr 17 11:19:36.408649 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:36.408613 2568 generic.go:358] "Generic (PLEG): container finished" podID="bfdb9877-c4ed-40e3-9a4c-80fe70a2f755" containerID="0b74235cd70e20ea9064fab5c17d84647200104a014c0eb9b86cdf644db8a6b3" exitCode=0
Apr 17 11:19:36.408649 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:36.408641 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8q9jb" event={"ID":"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755","Type":"ContainerDied","Data":"0b74235cd70e20ea9064fab5c17d84647200104a014c0eb9b86cdf644db8a6b3"}
Apr 17 11:19:36.409176 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:36.409077 2568 scope.go:117] "RemoveContainer" containerID="0b74235cd70e20ea9064fab5c17d84647200104a014c0eb9b86cdf644db8a6b3"
Apr 17 11:19:36.413981 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:36.413962 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:19:36.438886 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:36.438860 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-cbb9b68bb-dbb42" Apr 17 11:19:37.413444 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:37.413409 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8q9jb" event={"ID":"bfdb9877-c4ed-40e3-9a4c-80fe70a2f755","Type":"ContainerStarted","Data":"06454d12665f47d37afe0ddbbb357cabc0d5e6195507fe23ae77e72f53383004"} Apr 17 11:19:58.222976 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:58.222935 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:58.242538 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:58.242512 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:58.492080 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:19:58.492009 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:12.026312 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.026269 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:20:12.026902 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.026851 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="alertmanager" containerID="cri-o://06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7" gracePeriod=120 Apr 17 11:20:12.027210 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.027090 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy" containerID="cri-o://9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead" gracePeriod=120 Apr 17 11:20:12.027210 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.027090 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="config-reloader" containerID="cri-o://e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f" gracePeriod=120 Apr 17 11:20:12.027210 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.027110 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy-web" containerID="cri-o://8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90" gracePeriod=120 Apr 17 11:20:12.027210 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.027118 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="prom-label-proxy" containerID="cri-o://856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347" gracePeriod=120 Apr 17 11:20:12.027210 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.027191 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy-metric" containerID="cri-o://af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735" gracePeriod=120 Apr 17 11:20:12.523643 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.523610 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerID="856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347" 
exitCode=0 Apr 17 11:20:12.523643 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.523633 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerID="9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead" exitCode=0 Apr 17 11:20:12.523643 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.523639 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerID="e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f" exitCode=0 Apr 17 11:20:12.523643 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.523646 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerID="06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7" exitCode=0 Apr 17 11:20:12.523904 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.523678 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerDied","Data":"856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347"} Apr 17 11:20:12.523904 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.523709 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerDied","Data":"9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead"} Apr 17 11:20:12.523904 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.523720 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerDied","Data":"e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f"} Apr 17 11:20:12.523904 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:12.523729 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerDied","Data":"06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7"} Apr 17 11:20:13.267007 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.266985 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.379491 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379417 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-main-db\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379491 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379452 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379491 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379470 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-volume\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379491 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379488 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-metrics-client-ca\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379794 ip-10-0-142-114 
kubenswrapper[2568]: I0417 11:20:13.379510 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-main-tls\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379794 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379532 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379794 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379567 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-web-config\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379794 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379596 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2rpd\" (UniqueName: \"kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-kube-api-access-s2rpd\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379794 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379620 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-web\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379794 ip-10-0-142-114 
kubenswrapper[2568]: I0417 11:20:13.379662 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-out\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379794 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379690 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-tls-assets\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379794 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379715 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-trusted-ca-bundle\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379794 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379744 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-cluster-tls-config\") pod \"9f50171b-dc98-42e7-b7b2-e1911230afd4\" (UID: \"9f50171b-dc98-42e7-b7b2-e1911230afd4\") " Apr 17 11:20:13.379794 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.379752 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:20:13.380276 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.380013 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:20:13.380276 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.380143 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-main-db\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.380276 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.380162 2568 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-metrics-client-ca\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.381052 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.381019 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:20:13.382939 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.382793 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:13.382939 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.382896 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:20:13.382939 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.382899 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:13.383156 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.383038 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-kube-api-access-s2rpd" (OuterVolumeSpecName: "kube-api-access-s2rpd") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "kube-api-access-s2rpd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:20:13.383156 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.383133 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:13.383765 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.383744 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:13.383906 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.383877 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-volume" (OuterVolumeSpecName: "config-volume") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:13.384228 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.384200 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-out" (OuterVolumeSpecName: "config-out") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:20:13.387349 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.387321 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:13.393498 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.393475 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-web-config" (OuterVolumeSpecName: "web-config") pod "9f50171b-dc98-42e7-b7b2-e1911230afd4" (UID: "9f50171b-dc98-42e7-b7b2-e1911230afd4"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:13.480557 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.480533 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.480557 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.480558 2568 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-volume\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.480688 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.480570 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-main-tls\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.480688 ip-10-0-142-114 
kubenswrapper[2568]: I0417 11:20:13.480580 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.480688 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.480589 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-web-config\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.480688 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.480598 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s2rpd\" (UniqueName: \"kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-kube-api-access-s2rpd\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.480688 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.480607 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.480688 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.480616 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f50171b-dc98-42e7-b7b2-e1911230afd4-config-out\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.480688 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.480624 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f50171b-dc98-42e7-b7b2-e1911230afd4-tls-assets\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.480688 ip-10-0-142-114 kubenswrapper[2568]: I0417 
11:20:13.480632 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f50171b-dc98-42e7-b7b2-e1911230afd4-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.480688 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.480640 2568 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9f50171b-dc98-42e7-b7b2-e1911230afd4-cluster-tls-config\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:13.529108 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.529072 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerID="af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735" exitCode=0 Apr 17 11:20:13.529108 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.529101 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerID="8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90" exitCode=0 Apr 17 11:20:13.529259 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.529147 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerDied","Data":"af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735"} Apr 17 11:20:13.529259 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.529183 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerDied","Data":"8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90"} Apr 17 11:20:13.529259 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.529193 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.529259 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.529204 2568 scope.go:117] "RemoveContainer" containerID="856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347" Apr 17 11:20:13.529455 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.529193 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9f50171b-dc98-42e7-b7b2-e1911230afd4","Type":"ContainerDied","Data":"a1615280fd3338a5bdd16b81640483393bed07e134e6a2f77c12fc52df30f0fa"} Apr 17 11:20:13.537297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.537280 2568 scope.go:117] "RemoveContainer" containerID="af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735" Apr 17 11:20:13.543836 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.543819 2568 scope.go:117] "RemoveContainer" containerID="9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead" Apr 17 11:20:13.550056 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.550041 2568 scope.go:117] "RemoveContainer" containerID="8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90" Apr 17 11:20:13.554522 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.554487 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:20:13.557105 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.557090 2568 scope.go:117] "RemoveContainer" containerID="e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f" Apr 17 11:20:13.560907 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.560886 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:20:13.563569 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.563552 2568 scope.go:117] "RemoveContainer" containerID="06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7" Apr 17 11:20:13.569601 
ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.569587 2568 scope.go:117] "RemoveContainer" containerID="ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d" Apr 17 11:20:13.575786 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.575770 2568 scope.go:117] "RemoveContainer" containerID="856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347" Apr 17 11:20:13.576007 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:20:13.575991 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347\": container with ID starting with 856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347 not found: ID does not exist" containerID="856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347" Apr 17 11:20:13.576075 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.576017 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347"} err="failed to get container status \"856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347\": rpc error: code = NotFound desc = could not find container \"856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347\": container with ID starting with 856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347 not found: ID does not exist" Apr 17 11:20:13.576075 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.576042 2568 scope.go:117] "RemoveContainer" containerID="af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735" Apr 17 11:20:13.576277 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:20:13.576259 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735\": container with ID starting with 
af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735 not found: ID does not exist" containerID="af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735" Apr 17 11:20:13.576313 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.576283 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735"} err="failed to get container status \"af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735\": rpc error: code = NotFound desc = could not find container \"af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735\": container with ID starting with af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735 not found: ID does not exist" Apr 17 11:20:13.576313 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.576298 2568 scope.go:117] "RemoveContainer" containerID="9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead" Apr 17 11:20:13.576650 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:20:13.576631 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead\": container with ID starting with 9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead not found: ID does not exist" containerID="9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead" Apr 17 11:20:13.576714 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.576655 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead"} err="failed to get container status \"9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead\": rpc error: code = NotFound desc = could not find container \"9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead\": container with ID starting with 
9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead not found: ID does not exist" Apr 17 11:20:13.576714 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.576670 2568 scope.go:117] "RemoveContainer" containerID="8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90" Apr 17 11:20:13.576910 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:20:13.576891 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90\": container with ID starting with 8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90 not found: ID does not exist" containerID="8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90" Apr 17 11:20:13.576948 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.576914 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90"} err="failed to get container status \"8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90\": rpc error: code = NotFound desc = could not find container \"8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90\": container with ID starting with 8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90 not found: ID does not exist" Apr 17 11:20:13.576948 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.576928 2568 scope.go:117] "RemoveContainer" containerID="e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f" Apr 17 11:20:13.577148 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:20:13.577132 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f\": container with ID starting with e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f not found: ID does not exist" 
containerID="e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f" Apr 17 11:20:13.577184 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.577154 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f"} err="failed to get container status \"e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f\": rpc error: code = NotFound desc = could not find container \"e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f\": container with ID starting with e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f not found: ID does not exist" Apr 17 11:20:13.577184 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.577168 2568 scope.go:117] "RemoveContainer" containerID="06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7" Apr 17 11:20:13.577419 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:20:13.577401 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7\": container with ID starting with 06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7 not found: ID does not exist" containerID="06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7" Apr 17 11:20:13.577466 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.577423 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7"} err="failed to get container status \"06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7\": rpc error: code = NotFound desc = could not find container \"06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7\": container with ID starting with 06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7 not found: ID does not exist" Apr 17 
11:20:13.577466 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.577437 2568 scope.go:117] "RemoveContainer" containerID="ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d" Apr 17 11:20:13.577656 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:20:13.577639 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d\": container with ID starting with ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d not found: ID does not exist" containerID="ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d" Apr 17 11:20:13.577699 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.577660 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d"} err="failed to get container status \"ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d\": rpc error: code = NotFound desc = could not find container \"ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d\": container with ID starting with ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d not found: ID does not exist" Apr 17 11:20:13.577699 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.577672 2568 scope.go:117] "RemoveContainer" containerID="856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347" Apr 17 11:20:13.577902 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.577884 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347"} err="failed to get container status \"856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347\": rpc error: code = NotFound desc = could not find container \"856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347\": container with ID starting 
with 856eee4f408efbe5dbff0ec4882ea38bb17d959b5eb6ad3dab0421d5ad137347 not found: ID does not exist" Apr 17 11:20:13.577947 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.577902 2568 scope.go:117] "RemoveContainer" containerID="af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735" Apr 17 11:20:13.578117 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.578089 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735"} err="failed to get container status \"af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735\": rpc error: code = NotFound desc = could not find container \"af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735\": container with ID starting with af15e6aafd235d16d7e368eb4cf73735854f438a07ca006f1f61095845854735 not found: ID does not exist" Apr 17 11:20:13.578185 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.578119 2568 scope.go:117] "RemoveContainer" containerID="9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead" Apr 17 11:20:13.578339 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.578322 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead"} err="failed to get container status \"9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead\": rpc error: code = NotFound desc = could not find container \"9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead\": container with ID starting with 9455438b5bc1f2fd447878f3a50dc5711f727a7ab7e8ee4e5724ada06e950ead not found: ID does not exist" Apr 17 11:20:13.578402 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.578340 2568 scope.go:117] "RemoveContainer" containerID="8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90" Apr 17 11:20:13.578579 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.578554 
2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90"} err="failed to get container status \"8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90\": rpc error: code = NotFound desc = could not find container \"8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90\": container with ID starting with 8e22a0d032ecb7a49297926a810be25629c2123d885a1de61f2e7deee4ca2c90 not found: ID does not exist" Apr 17 11:20:13.578622 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.578579 2568 scope.go:117] "RemoveContainer" containerID="e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f" Apr 17 11:20:13.578788 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.578769 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f"} err="failed to get container status \"e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f\": rpc error: code = NotFound desc = could not find container \"e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f\": container with ID starting with e5ff93afd831fe67c0a6bc26e3869e87c67e93dabc7226cc2192b265e17e0d4f not found: ID does not exist" Apr 17 11:20:13.578853 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.578790 2568 scope.go:117] "RemoveContainer" containerID="06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7" Apr 17 11:20:13.578985 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.578968 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7"} err="failed to get container status \"06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7\": rpc error: code = NotFound desc = could not find container 
\"06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7\": container with ID starting with 06c496a0c786e6b6c0b597ed05b603f0e6db6ee32103f01b98850de4f8a552f7 not found: ID does not exist" Apr 17 11:20:13.579029 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.578987 2568 scope.go:117] "RemoveContainer" containerID="ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d" Apr 17 11:20:13.579194 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.579177 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d"} err="failed to get container status \"ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d\": rpc error: code = NotFound desc = could not find container \"ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d\": container with ID starting with ccf967e21c8c0bf235e2cc11f1148b4ae9a9a9cbe1a5931981c36fde264fa39d not found: ID does not exist" Apr 17 11:20:13.595661 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.595639 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:20:13.595954 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.595943 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy-metric" Apr 17 11:20:13.595995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.595956 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy-metric" Apr 17 11:20:13.595995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.595967 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16f785fc-52f1-4f27-a4e9-f56d09ae67b2" containerName="registry" Apr 17 11:20:13.595995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.595972 2568 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="16f785fc-52f1-4f27-a4e9-f56d09ae67b2" containerName="registry" Apr 17 11:20:13.595995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.595983 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="init-config-reloader" Apr 17 11:20:13.595995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.595989 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="init-config-reloader" Apr 17 11:20:13.595995 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.595995 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="config-reloader" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596001 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="config-reloader" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596013 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy-web" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596018 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy-web" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596025 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596029 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596034 2568 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="prom-label-proxy" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596039 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="prom-label-proxy" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596046 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="alertmanager" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596051 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="alertmanager" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596094 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy-metric" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596101 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="16f785fc-52f1-4f27-a4e9-f56d09ae67b2" containerName="registry" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596108 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="alertmanager" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596114 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy-web" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596119 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="kube-rbac-proxy" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596127 
2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="prom-label-proxy" Apr 17 11:20:13.596164 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.596133 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" containerName="config-reloader" Apr 17 11:20:13.601014 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.600999 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.607025 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.604293 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 11:20:13.607025 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.604650 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 11:20:13.607025 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.604688 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 11:20:13.607025 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.604909 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 11:20:13.607025 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.604926 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 11:20:13.607025 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.604959 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 11:20:13.607025 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.605231 2568 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 11:20:13.607025 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.605651 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dfl4n\"" Apr 17 11:20:13.607025 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.605703 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 11:20:13.611236 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.611216 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 11:20:13.617930 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.617886 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:20:13.682707 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682653 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.682707 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682688 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.682707 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682707 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9b67\" (UniqueName: \"kubernetes.io/projected/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-kube-api-access-z9b67\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.682855 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682755 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-web-config\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.682855 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682788 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.682855 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-config-volume\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.682855 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682840 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.682992 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682871 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-tls-assets\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.682992 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682897 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-config-out\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.682992 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682946 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.682992 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.682983 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.683110 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.683021 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.683110 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.683041 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784070 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784045 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784185 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784080 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9b67\" (UniqueName: \"kubernetes.io/projected/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-kube-api-access-z9b67\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784185 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784097 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-web-config\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784185 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784113 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784336 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784264 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-config-volume\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784336 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784482 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784349 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-tls-assets\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784632 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784583 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-config-out\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784700 ip-10-0-142-114 
kubenswrapper[2568]: I0417 11:20:13.784659 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784760 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784697 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784760 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784740 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784760 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784743 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:13.784914 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: 
\"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.784914 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.784843 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.785222 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.785199 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.785305 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.785275 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.787097 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.787070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-config-out\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.787200 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.787096 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.787200 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.787144 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-tls-assets\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.787447 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.787423 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-web-config\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.787535 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.787430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.787584 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.787535 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.787942 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.787915 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.788504 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.788485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-config-volume\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.788956 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.788936 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.793623 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.793605 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9b67\" (UniqueName: \"kubernetes.io/projected/788c2618-9fc8-4d4c-9e7d-f72f5c6b6938-kube-api-access-z9b67\") pod \"alertmanager-main-0\" (UID: \"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:13.916193 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:13.916166 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:14.255120 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:14.255096 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:20:14.257099 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:20:14.257067 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod788c2618_9fc8_4d4c_9e7d_f72f5c6b6938.slice/crio-036d1c204647885411172b83ea362c1555045ace2a62aa5671b6bbe60ca4ec17 WatchSource:0}: Error finding container 036d1c204647885411172b83ea362c1555045ace2a62aa5671b6bbe60ca4ec17: Status 404 returned error can't find the container with id 036d1c204647885411172b83ea362c1555045ace2a62aa5671b6bbe60ca4ec17
Apr 17 11:20:14.534724 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:14.534684 2568 generic.go:358] "Generic (PLEG): container finished" podID="788c2618-9fc8-4d4c-9e7d-f72f5c6b6938" containerID="6e38a0ecb0b98bbb486c293dd0b42f26b85b2d0e487e07fd80bd309e30f3d0ce" exitCode=0
Apr 17 11:20:14.535184 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:14.534747 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938","Type":"ContainerDied","Data":"6e38a0ecb0b98bbb486c293dd0b42f26b85b2d0e487e07fd80bd309e30f3d0ce"}
Apr 17 11:20:14.535184 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:14.534767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938","Type":"ContainerStarted","Data":"036d1c204647885411172b83ea362c1555045ace2a62aa5671b6bbe60ca4ec17"}
Apr 17 11:20:14.721922 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:14.721888 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f50171b-dc98-42e7-b7b2-e1911230afd4" path="/var/lib/kubelet/pods/9f50171b-dc98-42e7-b7b2-e1911230afd4/volumes"
Apr 17 11:20:15.540509 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:15.540477 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938","Type":"ContainerStarted","Data":"34b9625337709c29fe7b199fe4da292d39e6ced9145aef0201389cf8f06a43a2"}
Apr 17 11:20:15.540509 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:15.540512 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938","Type":"ContainerStarted","Data":"1f7441dfa188fe29d6b966c9287f0579037e697a2411fdd2813d815cf3bd83c9"}
Apr 17 11:20:15.540895 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:15.540521 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938","Type":"ContainerStarted","Data":"5a16f11a23c1c24d9e94ae275d2e3f5e8e79e3bf7c16f0a858b21ca03e26571e"}
Apr 17 11:20:15.540895 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:15.540529 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938","Type":"ContainerStarted","Data":"68f09eaf924cc8e25b20194d367d660d92215694bc5ee9d35fa506997e04ce86"}
Apr 17 11:20:15.540895 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:15.540536 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938","Type":"ContainerStarted","Data":"e543ffac533f598472a3bb424f803edde734de65413cbbfc7a5f1f19a221c992"}
Apr 17 11:20:15.540895 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:15.540543 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"788c2618-9fc8-4d4c-9e7d-f72f5c6b6938","Type":"ContainerStarted","Data":"76c5eb0523eae7f036a63e8fdb737fbd8209327c4750d22d821acf2125e26b2f"}
Apr 17 11:20:15.586670 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:15.586625 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.586611455 podStartE2EDuration="2.586611455s" podCreationTimestamp="2026-04-17 11:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:20:15.583640038 +0000 UTC m=+245.506494879" watchObservedRunningTime="2026-04-17 11:20:15.586611455 +0000 UTC m=+245.509466273"
Apr 17 11:20:16.094148 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.094107 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-bb88f555c-k7tn4"]
Apr 17 11:20:16.097723 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.097705 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.122427 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.122371 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 11:20:16.122618 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.122462 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 11:20:16.122618 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.122540 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 11:20:16.123180 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.123158 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 11:20:16.123180 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.123171 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-xf9qn\""
Apr 17 11:20:16.123364 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.123211 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 11:20:16.124699 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.124673 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-bb88f555c-k7tn4"]
Apr 17 11:20:16.128557 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.128530 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 11:20:16.205060 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.205025 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-federate-client-tls\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.205266 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.205124 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.205266 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.205220 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-telemeter-client-tls\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.205488 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.205306 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-serving-certs-ca-bundle\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.205488 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.205365 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98ssq\" (UniqueName: \"kubernetes.io/projected/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-kube-api-access-98ssq\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.205488 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.205420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-secret-telemeter-client\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.205488 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.205455 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.205488 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.205485 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-metrics-client-ca\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.305852 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.305824 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-serving-certs-ca-bundle\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.306029 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.305863 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98ssq\" (UniqueName: \"kubernetes.io/projected/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-kube-api-access-98ssq\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.306029 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.305885 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-secret-telemeter-client\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.306029 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.305906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.306029 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.306016 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-metrics-client-ca\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.306261 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.306044 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-federate-client-tls\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.306261 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.306119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.306261 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.306194 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-telemeter-client-tls\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.306695 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.306672 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-serving-certs-ca-bundle\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.306774 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.306711 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-metrics-client-ca\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.307043 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.307011 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.309263 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.309242 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-secret-telemeter-client\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.309263 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.309255 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-telemeter-client-tls\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.309517 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.309497 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.309656 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.309643 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-federate-client-tls\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.315864 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.315844 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98ssq\" (UniqueName: \"kubernetes.io/projected/b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3-kube-api-access-98ssq\") pod \"telemeter-client-bb88f555c-k7tn4\" (UID: \"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3\") " pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.391938 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.391841 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:20:16.392423 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.392352 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="prometheus" containerID="cri-o://62d972efc151aa31e016650847f06049541a13e15b963a29632bfebec8668492" gracePeriod=600
Apr 17 11:20:16.392537 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.392434 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="config-reloader" containerID="cri-o://1db3aa0b0f25d74ba7857ca82014fd77e841490f5911e8bde7c74d61f361548e" gracePeriod=600
Apr 17 11:20:16.392537 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.392468 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy-web" containerID="cri-o://a7448496c05dc950ac76213221c63e5c025381a4ccdd572d739480ebafb7503b" gracePeriod=600
Apr 17 11:20:16.392537 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.392422 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="thanos-sidecar" containerID="cri-o://b547c4ebbe7e0f647beabb3d1bf7d537f90e1743af5dd626d72a5c54ae5cef63" gracePeriod=600
Apr 17 11:20:16.392537 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.392360 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy" containerID="cri-o://307ee33c07092c7cee3ae5f4dd509470ec752be1c01e7a18e001d90c069a80fe" gracePeriod=600
Apr 17 11:20:16.393281 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.392787 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy-thanos" containerID="cri-o://535eb0f5fe25e113838b745235c12e79054fc5558806d5081f5a0dc292256ef7" gracePeriod=600
Apr 17 11:20:16.408870 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.408623 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4"
Apr 17 11:20:16.548297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548229 2568 generic.go:358] "Generic (PLEG): container finished" podID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerID="535eb0f5fe25e113838b745235c12e79054fc5558806d5081f5a0dc292256ef7" exitCode=0
Apr 17 11:20:16.548297 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548286 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerDied","Data":"535eb0f5fe25e113838b745235c12e79054fc5558806d5081f5a0dc292256ef7"}
Apr 17 11:20:16.548778 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548317 2568 generic.go:358] "Generic (PLEG): container finished" podID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerID="307ee33c07092c7cee3ae5f4dd509470ec752be1c01e7a18e001d90c069a80fe" exitCode=0
Apr 17 11:20:16.548778 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548326 2568 generic.go:358] "Generic (PLEG): container finished" podID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerID="a7448496c05dc950ac76213221c63e5c025381a4ccdd572d739480ebafb7503b" exitCode=0
Apr 17 11:20:16.548778 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548327 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerDied","Data":"307ee33c07092c7cee3ae5f4dd509470ec752be1c01e7a18e001d90c069a80fe"}
Apr 17 11:20:16.548778 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548343 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerDied","Data":"a7448496c05dc950ac76213221c63e5c025381a4ccdd572d739480ebafb7503b"}
Apr 17 11:20:16.548778 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548356 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerDied","Data":"b547c4ebbe7e0f647beabb3d1bf7d537f90e1743af5dd626d72a5c54ae5cef63"}
Apr 17 11:20:16.548778 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548331 2568 generic.go:358] "Generic (PLEG): container finished" podID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerID="b547c4ebbe7e0f647beabb3d1bf7d537f90e1743af5dd626d72a5c54ae5cef63" exitCode=0
Apr 17 11:20:16.548778 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548403 2568 generic.go:358] "Generic (PLEG): container finished" podID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerID="1db3aa0b0f25d74ba7857ca82014fd77e841490f5911e8bde7c74d61f361548e" exitCode=0
Apr 17 11:20:16.548778 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548417 2568 generic.go:358] "Generic (PLEG): container finished" podID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerID="62d972efc151aa31e016650847f06049541a13e15b963a29632bfebec8668492" exitCode=0
Apr 17 11:20:16.548778 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548487 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerDied","Data":"1db3aa0b0f25d74ba7857ca82014fd77e841490f5911e8bde7c74d61f361548e"}
Apr 17 11:20:16.548778 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.548514 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerDied","Data":"62d972efc151aa31e016650847f06049541a13e15b963a29632bfebec8668492"}
Apr 17 11:20:16.572051 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.572030 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-bb88f555c-k7tn4"]
Apr 17 11:20:16.574054 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:20:16.574023 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9bd3d6c_e911_4b10_a3b3_87d27acc1dd3.slice/crio-c123ea67e897b67b907affbef2ac7a7c630cd27b70456021ba21d802e36869b9 WatchSource:0}: Error finding container c123ea67e897b67b907affbef2ac7a7c630cd27b70456021ba21d802e36869b9: Status 404 returned error can't find the container with id c123ea67e897b67b907affbef2ac7a7c630cd27b70456021ba21d802e36869b9
Apr 17 11:20:16.657541 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.657518 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:16.713017 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.712990 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-db\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713183 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713044 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-tls\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713183 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713085 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-metrics-client-certs\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713183 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713101 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-thanos-prometheus-http-client-file\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713183 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713138 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgfk5\" (UniqueName: \"kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-kube-api-access-kgfk5\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713429 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713312 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-web-config\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713429 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713360 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713429 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713403 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713589 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713438 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config-out\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713589 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713471 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-metrics-client-ca\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713589 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713502 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713589 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713547 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-serving-certs-ca-bundle\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713589 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713580 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-tls-assets\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713834 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713605 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-kube-rbac-proxy\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713834 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713639 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-kubelet-serving-ca-bundle\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713834 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713690 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-trusted-ca-bundle\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713834 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713715 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-rulefiles-0\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.713834 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.713741 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-grpc-tls\") pod \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\" (UID: \"f3fee14c-4ad3-4e74-8743-1aadbb6a94de\") "
Apr 17 11:20:16.714839 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.714570 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:16.714839 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.714652 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:16.715558 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.715529 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:16.715646 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.715605 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:20:16.716261 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.716230 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:16.717469 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.717421 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:16.717714 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.717690 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config" (OuterVolumeSpecName: "config") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:16.717851 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.717817 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "thanos-prometheus-http-client-file".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:16.718002 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.717977 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:16.718121 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.718085 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:20:16.718377 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.718351 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:16.718604 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.718572 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:20:16.718810 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.718779 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:16.719545 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.719493 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:16.720064 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.720018 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config-out" (OuterVolumeSpecName: "config-out") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:20:16.720064 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.720044 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-kube-api-access-kgfk5" (OuterVolumeSpecName: "kube-api-access-kgfk5") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "kube-api-access-kgfk5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:20:16.721810 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.721785 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:16.736685 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.736652 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-web-config" (OuterVolumeSpecName: "web-config") pod "f3fee14c-4ad3-4e74-8743-1aadbb6a94de" (UID: "f3fee14c-4ad3-4e74-8743-1aadbb6a94de"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:16.814787 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814756 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kgfk5\" (UniqueName: \"kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-kube-api-access-kgfk5\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:16.814787 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814785 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-web-config\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814796 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:16.814969 
ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814805 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814814 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config-out\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814824 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-metrics-client-ca\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814832 2568 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-config\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814841 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814849 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-tls-assets\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 
11:20:16.814858 2568 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-kube-rbac-proxy\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814866 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814877 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-trusted-ca-bundle\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814887 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814895 2568 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-grpc-tls\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814903 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-prometheus-k8s-db\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814913 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-prometheus-k8s-tls\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814923 2568 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-secret-metrics-client-certs\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:20:16.814969 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:16.814932 2568 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3fee14c-4ad3-4e74-8743-1aadbb6a94de-thanos-prometheus-http-client-file\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\""
Apr 17 11:20:17.553946 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.553908 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4" event={"ID":"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3","Type":"ContainerStarted","Data":"c123ea67e897b67b907affbef2ac7a7c630cd27b70456021ba21d802e36869b9"}
Apr 17 11:20:17.557343 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.557313 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3fee14c-4ad3-4e74-8743-1aadbb6a94de","Type":"ContainerDied","Data":"a24b523872ebaa0d39da3d32f89e53f23c9f3684287dad1e2bc117daa8f65b5b"}
Apr 17 11:20:17.557501 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.557360 2568 scope.go:117] "RemoveContainer" containerID="535eb0f5fe25e113838b745235c12e79054fc5558806d5081f5a0dc292256ef7"
Apr 17 11:20:17.557501 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.557410 2568 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.566488 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.566419 2568 scope.go:117] "RemoveContainer" containerID="307ee33c07092c7cee3ae5f4dd509470ec752be1c01e7a18e001d90c069a80fe"
Apr 17 11:20:17.574831 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.574808 2568 scope.go:117] "RemoveContainer" containerID="a7448496c05dc950ac76213221c63e5c025381a4ccdd572d739480ebafb7503b"
Apr 17 11:20:17.582716 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.582700 2568 scope.go:117] "RemoveContainer" containerID="b547c4ebbe7e0f647beabb3d1bf7d537f90e1743af5dd626d72a5c54ae5cef63"
Apr 17 11:20:17.590470 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.590430 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:20:17.591041 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.591022 2568 scope.go:117] "RemoveContainer" containerID="1db3aa0b0f25d74ba7857ca82014fd77e841490f5911e8bde7c74d61f361548e"
Apr 17 11:20:17.598092 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.598071 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:20:17.599100 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.599086 2568 scope.go:117] "RemoveContainer" containerID="62d972efc151aa31e016650847f06049541a13e15b963a29632bfebec8668492"
Apr 17 11:20:17.606265 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.606249 2568 scope.go:117] "RemoveContainer" containerID="003d2c6676f2abb21c8f123bace6e4ff073b7ccd2979e1442f833913bc76fa35"
Apr 17 11:20:17.639964 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.639938 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:20:17.640313 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640300 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="thanos-sidecar"
Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640315 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="thanos-sidecar"
Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640325 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy-web"
Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640331 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy-web"
Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640337 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy"
Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640342 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy"
Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640358 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy-thanos"
Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640363 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy-thanos"
Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640371 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="config-reloader"
Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640392 2568 state_mem.go:107]
"Deleted CPUSet assignment" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="config-reloader" Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640406 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="prometheus" Apr 17 11:20:17.640415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640414 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="prometheus" Apr 17 11:20:17.640838 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640422 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="init-config-reloader" Apr 17 11:20:17.640838 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640431 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="init-config-reloader" Apr 17 11:20:17.640838 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640499 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="thanos-sidecar" Apr 17 11:20:17.640838 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640509 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="prometheus" Apr 17 11:20:17.640838 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640518 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy-web" Apr 17 11:20:17.640838 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640526 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy" Apr 17 11:20:17.640838 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640535 2568 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="kube-rbac-proxy-thanos" Apr 17 11:20:17.640838 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.640545 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" containerName="config-reloader" Apr 17 11:20:17.644435 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.644410 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.647949 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.647802 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 11:20:17.647949 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.647859 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 11:20:17.648137 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.647980 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 11:20:17.648137 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.647994 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 11:20:17.648137 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.648080 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 11:20:17.648137 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.648101 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-947cv5euin9p9\"" Apr 17 11:20:17.648371 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.648174 2568 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 11:20:17.648371 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.648222 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 11:20:17.648371 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.648224 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rbkbf\"" Apr 17 11:20:17.648371 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.648278 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 11:20:17.648800 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.648786 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 11:20:17.650083 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.650056 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 11:20:17.657462 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.657439 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 11:20:17.659183 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.659163 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 11:20:17.663878 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.663858 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:20:17.722294 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722266 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-config\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.722450 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722299 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.722450 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722327 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.722450 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722344 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.722450 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722391 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
11:20:17.722450 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722417 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.722755 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722493 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.722755 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722524 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.722755 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.722755 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.722755 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722698 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.722755 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722731 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-config-out\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.722755 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722755 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.723005 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722778 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5m88\" (UniqueName: \"kubernetes.io/projected/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-kube-api-access-r5m88\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.723005 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722805 2568
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.723005 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722848 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.723005 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722918 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-web-config\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.723005 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.722945 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824285 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824210 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-config\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824285 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824246 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824285 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824572 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824297 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824572 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824337 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824572 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824362 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824572 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824413 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824572 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824572 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824471 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824572 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824509 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:20:17.824572 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824540 2568
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.824954 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824583 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-config-out\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.824954 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824606 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.824954 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5m88\" (UniqueName: \"kubernetes.io/projected/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-kube-api-access-r5m88\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.824954 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824660 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.824954 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824684 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.824954 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824734 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-web-config\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.824954 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.824761 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.825309 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.825074 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.826833 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.825533 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
11:20:17.826833 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.825695 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.826833 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.826354 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.826833 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.826529 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.829823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.827718 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.829823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.827718 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.829823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.829024 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.829823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.829483 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.829823 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.829671 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.830363 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.830322 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-config-out\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.830571 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.830548 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-tls-assets\") pod \"prometheus-k8s-0\" (UID: 
\"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.831018 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.830973 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-config\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.831143 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.831120 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.831539 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.831519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.831610 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.831552 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.832710 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.832686 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.834798 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.834766 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5m88\" (UniqueName: \"kubernetes.io/projected/9cda29a4-4b2d-4bc1-98d7-6affcd01044b-kube-api-access-r5m88\") pod \"prometheus-k8s-0\" (UID: \"9cda29a4-4b2d-4bc1-98d7-6affcd01044b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:17.958093 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:17.958056 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:18.325248 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:18.325217 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:20:18.326667 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:20:18.326639 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cda29a4_4b2d_4bc1_98d7_6affcd01044b.slice/crio-f29b8d7247b5d6a8061b4295b57bba0b4847506a16316c513b7ccd0c2a1bb0e6 WatchSource:0}: Error finding container f29b8d7247b5d6a8061b4295b57bba0b4847506a16316c513b7ccd0c2a1bb0e6: Status 404 returned error can't find the container with id f29b8d7247b5d6a8061b4295b57bba0b4847506a16316c513b7ccd0c2a1bb0e6 Apr 17 11:20:18.562919 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:18.562884 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4" event={"ID":"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3","Type":"ContainerStarted","Data":"158cd8a3424b4c44fa9d4b34450db11ca2c7394c59851bb883c662bb527e7ac1"} Apr 17 11:20:18.562919 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:18.562925 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4" 
event={"ID":"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3","Type":"ContainerStarted","Data":"4f41e29e9160a680aca44256700e30e6fe4ae0c3ae294393c420d3fa92e3f1ba"} Apr 17 11:20:18.563415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:18.562936 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4" event={"ID":"b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3","Type":"ContainerStarted","Data":"e55ac7f6d5af3d5f9d3048cd2a8cffbdcad2248d2901250c1fd60fc8c4f1c65d"} Apr 17 11:20:18.564942 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:18.564922 2568 generic.go:358] "Generic (PLEG): container finished" podID="9cda29a4-4b2d-4bc1-98d7-6affcd01044b" containerID="7227a5059463943b362179a90843940413f0614fd4286520be70df37878e367d" exitCode=0 Apr 17 11:20:18.565037 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:18.564950 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cda29a4-4b2d-4bc1-98d7-6affcd01044b","Type":"ContainerDied","Data":"7227a5059463943b362179a90843940413f0614fd4286520be70df37878e367d"} Apr 17 11:20:18.565037 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:18.564967 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cda29a4-4b2d-4bc1-98d7-6affcd01044b","Type":"ContainerStarted","Data":"f29b8d7247b5d6a8061b4295b57bba0b4847506a16316c513b7ccd0c2a1bb0e6"} Apr 17 11:20:18.594062 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:18.594024 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-bb88f555c-k7tn4" podStartSLOduration=0.919740462 podStartE2EDuration="2.594010757s" podCreationTimestamp="2026-04-17 11:20:16 +0000 UTC" firstStartedPulling="2026-04-17 11:20:16.575913885 +0000 UTC m=+246.498768681" lastFinishedPulling="2026-04-17 11:20:18.25018418 +0000 UTC m=+248.173038976" observedRunningTime="2026-04-17 11:20:18.592196153 +0000 UTC 
m=+248.515050970" watchObservedRunningTime="2026-04-17 11:20:18.594010757 +0000 UTC m=+248.516865576" Apr 17 11:20:18.720517 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:18.720490 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3fee14c-4ad3-4e74-8743-1aadbb6a94de" path="/var/lib/kubelet/pods/f3fee14c-4ad3-4e74-8743-1aadbb6a94de/volumes" Apr 17 11:20:19.571933 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:19.571844 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cda29a4-4b2d-4bc1-98d7-6affcd01044b","Type":"ContainerStarted","Data":"8626b8220401a194428f5edb1a98adc7541bec49e70e4ce05a784b2dec6457be"} Apr 17 11:20:19.571933 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:19.571890 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cda29a4-4b2d-4bc1-98d7-6affcd01044b","Type":"ContainerStarted","Data":"608c9fe3da7d8f88a3d4f78ff1c381145a968e726bed69656e0ff35217089e76"} Apr 17 11:20:19.571933 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:19.571901 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cda29a4-4b2d-4bc1-98d7-6affcd01044b","Type":"ContainerStarted","Data":"9b132552468d47d5de25409a2e9e3eba689f7a8a8dabdbc72f928e36fa7f5de4"} Apr 17 11:20:19.571933 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:19.571910 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cda29a4-4b2d-4bc1-98d7-6affcd01044b","Type":"ContainerStarted","Data":"499f1394f68ec386985251e2120279e72d185d1235865887f473addd585820b3"} Apr 17 11:20:19.571933 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:19.571918 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"9cda29a4-4b2d-4bc1-98d7-6affcd01044b","Type":"ContainerStarted","Data":"6466eb4c0579318a823dad2f0286742b2893c6d47ba6733a78ed21d2174dbbeb"} Apr 17 11:20:19.571933 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:19.571927 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cda29a4-4b2d-4bc1-98d7-6affcd01044b","Type":"ContainerStarted","Data":"6e0adc6853a69a589cdb4e98d05b2a8410b9342ac3fdc1f3ea6a9246abecd7f7"} Apr 17 11:20:19.619174 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:19.619125 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.6191112050000003 podStartE2EDuration="2.619111205s" podCreationTimestamp="2026-04-17 11:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:20:19.617147129 +0000 UTC m=+249.540001946" watchObservedRunningTime="2026-04-17 11:20:19.619111205 +0000 UTC m=+249.541966085" Apr 17 11:20:22.465261 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:22.465208 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:20:22.467566 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:22.467533 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d84dc363-0ebb-4e0c-9b94-1024f80ccbb3-metrics-certs\") pod \"network-metrics-daemon-4d5rw\" (UID: \"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3\") " pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:20:22.520238 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:22.520208 2568 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-g9l5h\"" Apr 17 11:20:22.527940 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:22.527921 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4d5rw" Apr 17 11:20:22.856621 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:22.856589 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4d5rw"] Apr 17 11:20:22.859793 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:20:22.859767 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd84dc363_0ebb_4e0c_9b94_1024f80ccbb3.slice/crio-1e79e5817a3879fa9f6fea3185e2939035d1b48dae4fb0b28778b193d59f3eb3 WatchSource:0}: Error finding container 1e79e5817a3879fa9f6fea3185e2939035d1b48dae4fb0b28778b193d59f3eb3: Status 404 returned error can't find the container with id 1e79e5817a3879fa9f6fea3185e2939035d1b48dae4fb0b28778b193d59f3eb3 Apr 17 11:20:22.958588 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:22.958554 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:23.591344 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:23.591309 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4d5rw" event={"ID":"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3","Type":"ContainerStarted","Data":"1e79e5817a3879fa9f6fea3185e2939035d1b48dae4fb0b28778b193d59f3eb3"} Apr 17 11:20:24.596599 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:24.596560 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4d5rw" event={"ID":"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3","Type":"ContainerStarted","Data":"74d5e1cd2b33cce052e63004308d0e0f0b75628032199ec688f8a94c42063a4d"} Apr 17 11:20:24.596599 ip-10-0-142-114 
kubenswrapper[2568]: I0417 11:20:24.596596 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4d5rw" event={"ID":"d84dc363-0ebb-4e0c-9b94-1024f80ccbb3","Type":"ContainerStarted","Data":"6e17af363516b9bebf7c59f676af81544140cf7d1ef39a869d7661d50ec7173b"} Apr 17 11:20:24.617893 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:20:24.617852 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4d5rw" podStartSLOduration=253.694026027 podStartE2EDuration="4m14.617837909s" podCreationTimestamp="2026-04-17 11:16:10 +0000 UTC" firstStartedPulling="2026-04-17 11:20:22.861540819 +0000 UTC m=+252.784395618" lastFinishedPulling="2026-04-17 11:20:23.785352701 +0000 UTC m=+253.708207500" observedRunningTime="2026-04-17 11:20:24.616306873 +0000 UTC m=+254.539161692" watchObservedRunningTime="2026-04-17 11:20:24.617837909 +0000 UTC m=+254.540692776" Apr 17 11:21:10.544028 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:21:10.543978 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/2.log" Apr 17 11:21:10.544706 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:21:10.544668 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/2.log" Apr 17 11:21:10.555271 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:21:10.555256 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 11:21:17.958825 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:21:17.958785 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:21:17.974328 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:21:17.974301 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:21:18.780913 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:21:18.780889 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:24:03.119774 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.119736 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-npmd2"] Apr 17 11:24:03.122966 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.122949 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" Apr 17 11:24:03.125910 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.125858 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 11:24:03.125910 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.125858 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 11:24:03.126285 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.125957 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-x9zr5\"" Apr 17 11:24:03.134574 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.134554 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-npmd2"] Apr 17 11:24:03.250398 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.250361 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvtkv\" (UniqueName: \"kubernetes.io/projected/1375a1bc-d54b-40a8-bfe4-530437a9120e-kube-api-access-jvtkv\") pod \"cert-manager-cainjector-8966b78d4-npmd2\" (UID: \"1375a1bc-d54b-40a8-bfe4-530437a9120e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" Apr 17 11:24:03.250568 ip-10-0-142-114 
kubenswrapper[2568]: I0417 11:24:03.250438 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1375a1bc-d54b-40a8-bfe4-530437a9120e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-npmd2\" (UID: \"1375a1bc-d54b-40a8-bfe4-530437a9120e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" Apr 17 11:24:03.351098 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.351063 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvtkv\" (UniqueName: \"kubernetes.io/projected/1375a1bc-d54b-40a8-bfe4-530437a9120e-kube-api-access-jvtkv\") pod \"cert-manager-cainjector-8966b78d4-npmd2\" (UID: \"1375a1bc-d54b-40a8-bfe4-530437a9120e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" Apr 17 11:24:03.351250 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.351103 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1375a1bc-d54b-40a8-bfe4-530437a9120e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-npmd2\" (UID: \"1375a1bc-d54b-40a8-bfe4-530437a9120e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" Apr 17 11:24:03.359817 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.359784 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1375a1bc-d54b-40a8-bfe4-530437a9120e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-npmd2\" (UID: \"1375a1bc-d54b-40a8-bfe4-530437a9120e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" Apr 17 11:24:03.359934 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.359901 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvtkv\" (UniqueName: \"kubernetes.io/projected/1375a1bc-d54b-40a8-bfe4-530437a9120e-kube-api-access-jvtkv\") 
pod \"cert-manager-cainjector-8966b78d4-npmd2\" (UID: \"1375a1bc-d54b-40a8-bfe4-530437a9120e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" Apr 17 11:24:03.446713 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.446637 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" Apr 17 11:24:03.564052 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.564028 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-npmd2"] Apr 17 11:24:03.566314 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:24:03.566272 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1375a1bc_d54b_40a8_bfe4_530437a9120e.slice/crio-619ad39236e646363a164bdee75b9974f5c9bb43c2b4f0d0435fcb2547d09d77 WatchSource:0}: Error finding container 619ad39236e646363a164bdee75b9974f5c9bb43c2b4f0d0435fcb2547d09d77: Status 404 returned error can't find the container with id 619ad39236e646363a164bdee75b9974f5c9bb43c2b4f0d0435fcb2547d09d77 Apr 17 11:24:03.568357 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:03.568340 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:24:04.247402 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:04.247343 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" event={"ID":"1375a1bc-d54b-40a8-bfe4-530437a9120e","Type":"ContainerStarted","Data":"619ad39236e646363a164bdee75b9974f5c9bb43c2b4f0d0435fcb2547d09d77"} Apr 17 11:24:07.260040 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:07.259950 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" 
event={"ID":"1375a1bc-d54b-40a8-bfe4-530437a9120e","Type":"ContainerStarted","Data":"123abc616063d3f5028af430877c4d75b3d618fa3434a25598f4155a1ae09a1f"} Apr 17 11:24:07.277495 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:07.277451 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-npmd2" podStartSLOduration=1.038831034 podStartE2EDuration="4.277438181s" podCreationTimestamp="2026-04-17 11:24:03 +0000 UTC" firstStartedPulling="2026-04-17 11:24:03.568528064 +0000 UTC m=+473.491382860" lastFinishedPulling="2026-04-17 11:24:06.807135208 +0000 UTC m=+476.729990007" observedRunningTime="2026-04-17 11:24:07.276429613 +0000 UTC m=+477.199284432" watchObservedRunningTime="2026-04-17 11:24:07.277438181 +0000 UTC m=+477.200293000" Apr 17 11:24:08.907647 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:08.907610 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-rpvcd"] Apr 17 11:24:08.911051 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:08.911031 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" Apr 17 11:24:08.913764 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:08.913742 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-mctzz\"" Apr 17 11:24:08.918990 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:08.918967 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-rpvcd"] Apr 17 11:24:09.002153 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:09.002126 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9969af20-09e0-4a49-afde-d45016a0decf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-rpvcd\" (UID: \"9969af20-09e0-4a49-afde-d45016a0decf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" Apr 17 11:24:09.002279 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:09.002180 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4znlc\" (UniqueName: \"kubernetes.io/projected/9969af20-09e0-4a49-afde-d45016a0decf-kube-api-access-4znlc\") pod \"cert-manager-webhook-597b96b99b-rpvcd\" (UID: \"9969af20-09e0-4a49-afde-d45016a0decf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" Apr 17 11:24:09.103229 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:09.103202 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9969af20-09e0-4a49-afde-d45016a0decf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-rpvcd\" (UID: \"9969af20-09e0-4a49-afde-d45016a0decf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" Apr 17 11:24:09.103344 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:09.103256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4znlc\" (UniqueName: \"kubernetes.io/projected/9969af20-09e0-4a49-afde-d45016a0decf-kube-api-access-4znlc\") pod \"cert-manager-webhook-597b96b99b-rpvcd\" (UID: \"9969af20-09e0-4a49-afde-d45016a0decf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" Apr 17 11:24:09.112311 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:09.112285 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9969af20-09e0-4a49-afde-d45016a0decf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-rpvcd\" (UID: \"9969af20-09e0-4a49-afde-d45016a0decf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" Apr 17 11:24:09.112479 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:09.112461 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4znlc\" (UniqueName: \"kubernetes.io/projected/9969af20-09e0-4a49-afde-d45016a0decf-kube-api-access-4znlc\") pod \"cert-manager-webhook-597b96b99b-rpvcd\" (UID: \"9969af20-09e0-4a49-afde-d45016a0decf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" Apr 17 11:24:09.220643 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:09.220573 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" Apr 17 11:24:09.337828 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:09.337777 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-rpvcd"] Apr 17 11:24:09.341035 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:24:09.340998 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9969af20_09e0_4a49_afde_d45016a0decf.slice/crio-eea1770ea71c8493df3680e7460eb03950dfc4d2273507f8d9c231ae155ee68b WatchSource:0}: Error finding container eea1770ea71c8493df3680e7460eb03950dfc4d2273507f8d9c231ae155ee68b: Status 404 returned error can't find the container with id eea1770ea71c8493df3680e7460eb03950dfc4d2273507f8d9c231ae155ee68b Apr 17 11:24:10.272043 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:10.272008 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" event={"ID":"9969af20-09e0-4a49-afde-d45016a0decf","Type":"ContainerStarted","Data":"3e719eff29dcc10981ea4a23947738778327281eb848a605fb08c2723215e01a"} Apr 17 11:24:10.272043 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:10.272043 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" event={"ID":"9969af20-09e0-4a49-afde-d45016a0decf","Type":"ContainerStarted","Data":"eea1770ea71c8493df3680e7460eb03950dfc4d2273507f8d9c231ae155ee68b"} Apr 17 11:24:10.272504 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:10.272068 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" Apr 17 11:24:10.289199 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:10.289151 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" podStartSLOduration=2.289138265 
podStartE2EDuration="2.289138265s" podCreationTimestamp="2026-04-17 11:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:24:10.287661385 +0000 UTC m=+480.210516204" watchObservedRunningTime="2026-04-17 11:24:10.289138265 +0000 UTC m=+480.211993083" Apr 17 11:24:16.278091 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:16.278056 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-rpvcd" Apr 17 11:24:42.506062 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.506031 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b679b657-mghcj"] Apr 17 11:24:42.509565 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.509544 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.513374 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.513351 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:24:42.514633 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.514602 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-l5tjv\"" Apr 17 11:24:42.514633 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.514624 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 11:24:42.514796 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.514628 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 11:24:42.514796 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.514675 2568 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 11:24:42.514796 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.514605 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 11:24:42.529811 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.529787 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b679b657-mghcj"] Apr 17 11:24:42.585188 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.585165 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/523162f7-1208-467c-ad98-9d96d5c94f11-metrics-cert\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.585284 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.585198 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znnvl\" (UniqueName: \"kubernetes.io/projected/523162f7-1208-467c-ad98-9d96d5c94f11-kube-api-access-znnvl\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.585284 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.585220 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/523162f7-1208-467c-ad98-9d96d5c94f11-manager-config\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.585358 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.585306 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/523162f7-1208-467c-ad98-9d96d5c94f11-cert\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.686281 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.686244 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/523162f7-1208-467c-ad98-9d96d5c94f11-cert\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.686495 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.686326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/523162f7-1208-467c-ad98-9d96d5c94f11-metrics-cert\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.686495 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.686367 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znnvl\" (UniqueName: \"kubernetes.io/projected/523162f7-1208-467c-ad98-9d96d5c94f11-kube-api-access-znnvl\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.686495 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.686420 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/523162f7-1208-467c-ad98-9d96d5c94f11-manager-config\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: 
\"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.687034 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.687006 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/523162f7-1208-467c-ad98-9d96d5c94f11-manager-config\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.688798 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.688775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/523162f7-1208-467c-ad98-9d96d5c94f11-metrics-cert\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.688908 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.688805 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/523162f7-1208-467c-ad98-9d96d5c94f11-cert\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.699933 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.699900 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znnvl\" (UniqueName: \"kubernetes.io/projected/523162f7-1208-467c-ad98-9d96d5c94f11-kube-api-access-znnvl\") pod \"lws-controller-manager-56b679b657-mghcj\" (UID: \"523162f7-1208-467c-ad98-9d96d5c94f11\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.819608 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.819579 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:42.948224 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:42.948197 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b679b657-mghcj"] Apr 17 11:24:42.949709 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:24:42.949669 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod523162f7_1208_467c_ad98_9d96d5c94f11.slice/crio-58b5997fb98e734cb75b6f4de206185264cb37a967d8feb37a4aa2afa50715e8 WatchSource:0}: Error finding container 58b5997fb98e734cb75b6f4de206185264cb37a967d8feb37a4aa2afa50715e8: Status 404 returned error can't find the container with id 58b5997fb98e734cb75b6f4de206185264cb37a967d8feb37a4aa2afa50715e8 Apr 17 11:24:43.370455 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:43.370417 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" event={"ID":"523162f7-1208-467c-ad98-9d96d5c94f11","Type":"ContainerStarted","Data":"58b5997fb98e734cb75b6f4de206185264cb37a967d8feb37a4aa2afa50715e8"} Apr 17 11:24:46.382194 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:46.382160 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" event={"ID":"523162f7-1208-467c-ad98-9d96d5c94f11","Type":"ContainerStarted","Data":"4200c2b0e0e9366cc7a3e2bf5a31ac27dfe1693f367ea3c1c1afac4dedd4981a"} Apr 17 11:24:46.382552 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:46.382290 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:24:46.399309 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:46.399260 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" podStartSLOduration=1.467477341 podStartE2EDuration="4.399247266s" podCreationTimestamp="2026-04-17 11:24:42 +0000 UTC" firstStartedPulling="2026-04-17 11:24:42.951626037 +0000 UTC m=+512.874480848" lastFinishedPulling="2026-04-17 11:24:45.883395976 +0000 UTC m=+515.806250773" observedRunningTime="2026-04-17 11:24:46.398036946 +0000 UTC m=+516.320891764" watchObservedRunningTime="2026-04-17 11:24:46.399247266 +0000 UTC m=+516.322102084" Apr 17 11:24:57.388561 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:24:57.387659 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-56b679b657-mghcj" Apr 17 11:25:20.251192 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.251151 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj"] Apr 17 11:25:20.259600 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.259581 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj" Apr 17 11:25:20.262864 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.262837 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 11:25:20.263238 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.263210 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 11:25:20.263325 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.263211 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-jmws8\"" Apr 17 11:25:20.263469 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.263455 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 11:25:20.272694 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.272674 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj"] Apr 17 11:25:20.416742 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.416708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv6tm\" (UniqueName: \"kubernetes.io/projected/3b1d5cf9-1d08-4319-8354-727b2fd97c58-kube-api-access-dv6tm\") pod \"dns-operator-controller-manager-844548ff4c-wrptj\" (UID: \"3b1d5cf9-1d08-4319-8354-727b2fd97c58\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj" Apr 17 11:25:20.517737 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.517662 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv6tm\" (UniqueName: \"kubernetes.io/projected/3b1d5cf9-1d08-4319-8354-727b2fd97c58-kube-api-access-dv6tm\") pod 
\"dns-operator-controller-manager-844548ff4c-wrptj\" (UID: \"3b1d5cf9-1d08-4319-8354-727b2fd97c58\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj" Apr 17 11:25:20.527019 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.526993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv6tm\" (UniqueName: \"kubernetes.io/projected/3b1d5cf9-1d08-4319-8354-727b2fd97c58-kube-api-access-dv6tm\") pod \"dns-operator-controller-manager-844548ff4c-wrptj\" (UID: \"3b1d5cf9-1d08-4319-8354-727b2fd97c58\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj" Apr 17 11:25:20.569687 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.569662 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj" Apr 17 11:25:20.705054 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:20.705031 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj"] Apr 17 11:25:20.707528 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:25:20.707501 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b1d5cf9_1d08_4319_8354_727b2fd97c58.slice/crio-94d9480a82e5996671a767ca2d6d58e98ff89628b3515c05ab8bb3e09e10d7b5 WatchSource:0}: Error finding container 94d9480a82e5996671a767ca2d6d58e98ff89628b3515c05ab8bb3e09e10d7b5: Status 404 returned error can't find the container with id 94d9480a82e5996671a767ca2d6d58e98ff89628b3515c05ab8bb3e09e10d7b5 Apr 17 11:25:21.499935 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:21.499904 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj" event={"ID":"3b1d5cf9-1d08-4319-8354-727b2fd97c58","Type":"ContainerStarted","Data":"94d9480a82e5996671a767ca2d6d58e98ff89628b3515c05ab8bb3e09e10d7b5"} Apr 17 
11:25:24.511544 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:24.511508 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj" event={"ID":"3b1d5cf9-1d08-4319-8354-727b2fd97c58","Type":"ContainerStarted","Data":"0360c3213e68cd27ba831a48340e1bf6014ddac63dbf03b1ce4cae165c86d20c"} Apr 17 11:25:24.511930 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:24.511631 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj" Apr 17 11:25:25.082446 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:25.082366 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj" podStartSLOduration=2.234962942 podStartE2EDuration="5.082351952s" podCreationTimestamp="2026-04-17 11:25:20 +0000 UTC" firstStartedPulling="2026-04-17 11:25:20.709794516 +0000 UTC m=+550.632649313" lastFinishedPulling="2026-04-17 11:25:23.557183528 +0000 UTC m=+553.480038323" observedRunningTime="2026-04-17 11:25:24.536018614 +0000 UTC m=+554.458873431" watchObservedRunningTime="2026-04-17 11:25:25.082351952 +0000 UTC m=+555.005206770" Apr 17 11:25:25.082622 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:25.082560 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-cq52j"] Apr 17 11:25:25.086074 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:25.086054 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-cq52j" Apr 17 11:25:25.090173 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:25.090153 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-jrm99\"" Apr 17 11:25:25.102574 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:25.102545 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-cq52j"] Apr 17 11:25:25.262784 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:25.262750 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2qht\" (UniqueName: \"kubernetes.io/projected/9cfdecbe-51b1-4568-bd12-9c62e4a2da29-kube-api-access-x2qht\") pod \"authorino-operator-7587b89b76-cq52j\" (UID: \"9cfdecbe-51b1-4568-bd12-9c62e4a2da29\") " pod="kuadrant-system/authorino-operator-7587b89b76-cq52j" Apr 17 11:25:25.364106 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:25.364012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2qht\" (UniqueName: \"kubernetes.io/projected/9cfdecbe-51b1-4568-bd12-9c62e4a2da29-kube-api-access-x2qht\") pod \"authorino-operator-7587b89b76-cq52j\" (UID: \"9cfdecbe-51b1-4568-bd12-9c62e4a2da29\") " pod="kuadrant-system/authorino-operator-7587b89b76-cq52j" Apr 17 11:25:25.373072 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:25.373047 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2qht\" (UniqueName: \"kubernetes.io/projected/9cfdecbe-51b1-4568-bd12-9c62e4a2da29-kube-api-access-x2qht\") pod \"authorino-operator-7587b89b76-cq52j\" (UID: \"9cfdecbe-51b1-4568-bd12-9c62e4a2da29\") " pod="kuadrant-system/authorino-operator-7587b89b76-cq52j" Apr 17 11:25:25.396446 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:25.396419 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-cq52j" Apr 17 11:25:25.519214 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:25.519190 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-cq52j"] Apr 17 11:25:25.521961 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:25:25.521934 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cfdecbe_51b1_4568_bd12_9c62e4a2da29.slice/crio-677ee6c08c1658ea59fab574f5c5db3493701f7e4e8a4961d50391c295f5379e WatchSource:0}: Error finding container 677ee6c08c1658ea59fab574f5c5db3493701f7e4e8a4961d50391c295f5379e: Status 404 returned error can't find the container with id 677ee6c08c1658ea59fab574f5c5db3493701f7e4e8a4961d50391c295f5379e Apr 17 11:25:26.519954 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:26.519916 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-cq52j" event={"ID":"9cfdecbe-51b1-4568-bd12-9c62e4a2da29","Type":"ContainerStarted","Data":"677ee6c08c1658ea59fab574f5c5db3493701f7e4e8a4961d50391c295f5379e"} Apr 17 11:25:27.525654 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:27.525622 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-cq52j" event={"ID":"9cfdecbe-51b1-4568-bd12-9c62e4a2da29","Type":"ContainerStarted","Data":"bc9f96dcf1f6e650fa66645e81b6a14ce2ae455d6c76924b2072c86d79c0de3b"} Apr 17 11:25:27.526023 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:27.525706 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-cq52j" Apr 17 11:25:27.550181 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:27.550134 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-cq52j" podStartSLOduration=1.089788064 
podStartE2EDuration="2.550120939s" podCreationTimestamp="2026-04-17 11:25:25 +0000 UTC" firstStartedPulling="2026-04-17 11:25:25.524509102 +0000 UTC m=+555.447363901" lastFinishedPulling="2026-04-17 11:25:26.984841973 +0000 UTC m=+556.907696776" observedRunningTime="2026-04-17 11:25:27.548852187 +0000 UTC m=+557.471707009" watchObservedRunningTime="2026-04-17 11:25:27.550120939 +0000 UTC m=+557.472975757" Apr 17 11:25:35.516787 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:35.516751 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wrptj" Apr 17 11:25:38.532526 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:25:38.532487 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-cq52j" Apr 17 11:26:10.569520 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:10.569493 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/2.log" Apr 17 11:26:10.571877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:10.571854 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/2.log" Apr 17 11:26:18.581614 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.581579 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jhwfx"] Apr 17 11:26:18.584822 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.584807 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:18.587412 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.587370 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 11:26:18.587525 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.587506 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-lcs5v\"" Apr 17 11:26:18.596568 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.596549 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jhwfx"] Apr 17 11:26:18.629587 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.629562 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/bc95aa10-f45e-44c1-ab69-a1b3749c3412-config-file\") pod \"limitador-limitador-64c8f475fb-jhwfx\" (UID: \"bc95aa10-f45e-44c1-ab69-a1b3749c3412\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:18.629693 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.629652 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmlls\" (UniqueName: \"kubernetes.io/projected/bc95aa10-f45e-44c1-ab69-a1b3749c3412-kube-api-access-mmlls\") pod \"limitador-limitador-64c8f475fb-jhwfx\" (UID: \"bc95aa10-f45e-44c1-ab69-a1b3749c3412\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:18.685440 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.685409 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jhwfx"] Apr 17 11:26:18.730371 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.730336 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" 
(UniqueName: \"kubernetes.io/configmap/bc95aa10-f45e-44c1-ab69-a1b3749c3412-config-file\") pod \"limitador-limitador-64c8f475fb-jhwfx\" (UID: \"bc95aa10-f45e-44c1-ab69-a1b3749c3412\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:18.730545 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.730446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmlls\" (UniqueName: \"kubernetes.io/projected/bc95aa10-f45e-44c1-ab69-a1b3749c3412-kube-api-access-mmlls\") pod \"limitador-limitador-64c8f475fb-jhwfx\" (UID: \"bc95aa10-f45e-44c1-ab69-a1b3749c3412\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:18.730917 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.730899 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/bc95aa10-f45e-44c1-ab69-a1b3749c3412-config-file\") pod \"limitador-limitador-64c8f475fb-jhwfx\" (UID: \"bc95aa10-f45e-44c1-ab69-a1b3749c3412\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:18.739517 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.739485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmlls\" (UniqueName: \"kubernetes.io/projected/bc95aa10-f45e-44c1-ab69-a1b3749c3412-kube-api-access-mmlls\") pod \"limitador-limitador-64c8f475fb-jhwfx\" (UID: \"bc95aa10-f45e-44c1-ab69-a1b3749c3412\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:18.895834 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:18.895763 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:19.020079 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:19.020057 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jhwfx"] Apr 17 11:26:19.022090 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:26:19.022062 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc95aa10_f45e_44c1_ab69_a1b3749c3412.slice/crio-5a0d552f15e8d249faa988f0c78cc70458134146896da4c2e6c4bc74a2e41a3e WatchSource:0}: Error finding container 5a0d552f15e8d249faa988f0c78cc70458134146896da4c2e6c4bc74a2e41a3e: Status 404 returned error can't find the container with id 5a0d552f15e8d249faa988f0c78cc70458134146896da4c2e6c4bc74a2e41a3e Apr 17 11:26:19.716463 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:19.716417 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" event={"ID":"bc95aa10-f45e-44c1-ab69-a1b3749c3412","Type":"ContainerStarted","Data":"5a0d552f15e8d249faa988f0c78cc70458134146896da4c2e6c4bc74a2e41a3e"} Apr 17 11:26:23.733458 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:23.733416 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" event={"ID":"bc95aa10-f45e-44c1-ab69-a1b3749c3412","Type":"ContainerStarted","Data":"e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b"} Apr 17 11:26:23.733835 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:23.733546 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:23.751214 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:23.751169 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" podStartSLOduration=2.090675801 
podStartE2EDuration="5.751155579s" podCreationTimestamp="2026-04-17 11:26:18 +0000 UTC" firstStartedPulling="2026-04-17 11:26:19.024039683 +0000 UTC m=+608.946894480" lastFinishedPulling="2026-04-17 11:26:22.684519462 +0000 UTC m=+612.607374258" observedRunningTime="2026-04-17 11:26:23.749802752 +0000 UTC m=+613.672657570" watchObservedRunningTime="2026-04-17 11:26:23.751155579 +0000 UTC m=+613.674010396" Apr 17 11:26:34.737981 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:34.737906 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:35.105236 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.105203 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jhwfx"] Apr 17 11:26:35.105451 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.105426 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" podUID="bc95aa10-f45e-44c1-ab69-a1b3749c3412" containerName="limitador" containerID="cri-o://e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b" gracePeriod=30 Apr 17 11:26:35.649971 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.649951 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:35.774434 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.774337 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc95aa10-f45e-44c1-ab69-a1b3749c3412" containerID="e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b" exitCode=0 Apr 17 11:26:35.774434 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.774412 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" event={"ID":"bc95aa10-f45e-44c1-ab69-a1b3749c3412","Type":"ContainerDied","Data":"e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b"} Apr 17 11:26:35.774831 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.774438 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" Apr 17 11:26:35.774831 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.774461 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jhwfx" event={"ID":"bc95aa10-f45e-44c1-ab69-a1b3749c3412","Type":"ContainerDied","Data":"5a0d552f15e8d249faa988f0c78cc70458134146896da4c2e6c4bc74a2e41a3e"} Apr 17 11:26:35.774831 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.774477 2568 scope.go:117] "RemoveContainer" containerID="e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b" Apr 17 11:26:35.782204 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.782187 2568 scope.go:117] "RemoveContainer" containerID="e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b" Apr 17 11:26:35.782498 ip-10-0-142-114 kubenswrapper[2568]: E0417 11:26:35.782479 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b\": container with ID starting with 
e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b not found: ID does not exist" containerID="e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b" Apr 17 11:26:35.782569 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.782506 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b"} err="failed to get container status \"e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b\": rpc error: code = NotFound desc = could not find container \"e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b\": container with ID starting with e461cafe7602b2f6fe32d11da76bc0adc935ca65547ca2e133509c3dc1ff4d4b not found: ID does not exist" Apr 17 11:26:35.783759 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.783742 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/bc95aa10-f45e-44c1-ab69-a1b3749c3412-config-file\") pod \"bc95aa10-f45e-44c1-ab69-a1b3749c3412\" (UID: \"bc95aa10-f45e-44c1-ab69-a1b3749c3412\") " Apr 17 11:26:35.783843 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.783829 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmlls\" (UniqueName: \"kubernetes.io/projected/bc95aa10-f45e-44c1-ab69-a1b3749c3412-kube-api-access-mmlls\") pod \"bc95aa10-f45e-44c1-ab69-a1b3749c3412\" (UID: \"bc95aa10-f45e-44c1-ab69-a1b3749c3412\") " Apr 17 11:26:35.784047 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.784023 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc95aa10-f45e-44c1-ab69-a1b3749c3412-config-file" (OuterVolumeSpecName: "config-file") pod "bc95aa10-f45e-44c1-ab69-a1b3749c3412" (UID: "bc95aa10-f45e-44c1-ab69-a1b3749c3412"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:26:35.784204 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.784187 2568 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/bc95aa10-f45e-44c1-ab69-a1b3749c3412-config-file\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:26:35.785881 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.785862 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc95aa10-f45e-44c1-ab69-a1b3749c3412-kube-api-access-mmlls" (OuterVolumeSpecName: "kube-api-access-mmlls") pod "bc95aa10-f45e-44c1-ab69-a1b3749c3412" (UID: "bc95aa10-f45e-44c1-ab69-a1b3749c3412"). InnerVolumeSpecName "kube-api-access-mmlls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:26:35.885538 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:35.885516 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mmlls\" (UniqueName: \"kubernetes.io/projected/bc95aa10-f45e-44c1-ab69-a1b3749c3412-kube-api-access-mmlls\") on node \"ip-10-0-142-114.ec2.internal\" DevicePath \"\"" Apr 17 11:26:36.097339 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:36.097297 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jhwfx"] Apr 17 11:26:36.102448 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:36.102426 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jhwfx"] Apr 17 11:26:36.720281 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:36.720248 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc95aa10-f45e-44c1-ab69-a1b3749c3412" path="/var/lib/kubelet/pods/bc95aa10-f45e-44c1-ab69-a1b3749c3412/volumes" Apr 17 11:26:54.347244 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.347196 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2"] Apr 17 11:26:54.347655 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.347579 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc95aa10-f45e-44c1-ab69-a1b3749c3412" containerName="limitador" Apr 17 11:26:54.347655 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.347590 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc95aa10-f45e-44c1-ab69-a1b3749c3412" containerName="limitador" Apr 17 11:26:54.347730 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.347665 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc95aa10-f45e-44c1-ab69-a1b3749c3412" containerName="limitador" Apr 17 11:26:54.351947 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.351927 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.355244 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.355219 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 11:26:54.355376 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.355322 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 11:26:54.355500 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.355479 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-72qnh\"" Apr 17 11:26:54.355555 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.355499 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 11:26:54.355555 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.355547 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 
11:26:54.356549 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.356529 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 11:26:54.356644 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.356547 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 11:26:54.370642 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.368908 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2"] Apr 17 11:26:54.446803 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.446757 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.446978 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.446812 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.446978 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.446851 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/aeb39f46-0617-47cd-8bed-daf1632ca8f7-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" 
Apr 17 11:26:54.446978 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.446885 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/aeb39f46-0617-47cd-8bed-daf1632ca8f7-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.447098 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.446977 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.447098 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.447031 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt6zc\" (UniqueName: \"kubernetes.io/projected/aeb39f46-0617-47cd-8bed-daf1632ca8f7-kube-api-access-mt6zc\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.447098 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.447087 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.548293 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.548247 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/aeb39f46-0617-47cd-8bed-daf1632ca8f7-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.548498 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.548313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.548498 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.548358 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt6zc\" (UniqueName: \"kubernetes.io/projected/aeb39f46-0617-47cd-8bed-daf1632ca8f7-kube-api-access-mt6zc\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.548498 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.548419 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.548498 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.548490 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: 
\"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.548751 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.548517 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.548751 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.548547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/aeb39f46-0617-47cd-8bed-daf1632ca8f7-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.550419 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.550371 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.553769 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.553717 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.557183 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.557159 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/aeb39f46-0617-47cd-8bed-daf1632ca8f7-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.558765 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.558734 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.558974 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.558945 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/aeb39f46-0617-47cd-8bed-daf1632ca8f7-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.560847 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.560815 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6zc\" (UniqueName: \"kubernetes.io/projected/aeb39f46-0617-47cd-8bed-daf1632ca8f7-kube-api-access-mt6zc\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.560949 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.560866 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aeb39f46-0617-47cd-8bed-daf1632ca8f7-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-wvdj2\" (UID: \"aeb39f46-0617-47cd-8bed-daf1632ca8f7\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.661990 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.661910 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:54.834117 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:26:54.834086 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeb39f46_0617_47cd_8bed_daf1632ca8f7.slice/crio-5ccff8a661b42233898b5c518d3895579508e49e9f41921ad1c0083c7435dd95 WatchSource:0}: Error finding container 5ccff8a661b42233898b5c518d3895579508e49e9f41921ad1c0083c7435dd95: Status 404 returned error can't find the container with id 5ccff8a661b42233898b5c518d3895579508e49e9f41921ad1c0083c7435dd95 Apr 17 11:26:54.834477 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.834451 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2"] Apr 17 11:26:54.840151 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:54.840120 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" event={"ID":"aeb39f46-0617-47cd-8bed-daf1632ca8f7","Type":"ContainerStarted","Data":"5ccff8a661b42233898b5c518d3895579508e49e9f41921ad1c0083c7435dd95"} Apr 17 11:26:57.391942 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:57.391472 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 11:26:57.391942 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:57.391563 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 11:26:57.854183 ip-10-0-142-114 kubenswrapper[2568]: 
I0417 11:26:57.854144 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" event={"ID":"aeb39f46-0617-47cd-8bed-daf1632ca8f7","Type":"ContainerStarted","Data":"0f773e53ce3912de5526edc68e9e50e38184f56ac9ff0b3035977a42ea292b6c"} Apr 17 11:26:57.854340 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:57.854255 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:26:57.880038 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:57.879975 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" podStartSLOduration=1.324983982 podStartE2EDuration="3.879954721s" podCreationTimestamp="2026-04-17 11:26:54 +0000 UTC" firstStartedPulling="2026-04-17 11:26:54.836173186 +0000 UTC m=+644.759027982" lastFinishedPulling="2026-04-17 11:26:57.391143908 +0000 UTC m=+647.313998721" observedRunningTime="2026-04-17 11:26:57.878821367 +0000 UTC m=+647.801676187" watchObservedRunningTime="2026-04-17 11:26:57.879954721 +0000 UTC m=+647.802809542" Apr 17 11:26:58.859235 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:26:58.859209 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wvdj2" Apr 17 11:29:06.487100 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:06.487060 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-wvdj2_aeb39f46-0617-47cd-8bed-daf1632ca8f7/discovery/0.log" Apr 17 11:29:07.256257 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:07.256223 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-wvdj2_aeb39f46-0617-47cd-8bed-daf1632ca8f7/discovery/0.log" Apr 17 11:29:08.324563 ip-10-0-142-114 kubenswrapper[2568]: I0417 
11:29:08.324538 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-cq52j_9cfdecbe-51b1-4568-bd12-9c62e4a2da29/manager/0.log" Apr 17 11:29:08.335332 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:08.335309 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-wrptj_3b1d5cf9-1d08-4319-8354-727b2fd97c58/manager/0.log" Apr 17 11:29:15.524038 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:15.524004 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tvkdx_4f77cfcb-60a1-4c91-8f58-dac82efa3fe4/global-pull-secret-syncer/0.log" Apr 17 11:29:15.586001 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:15.585964 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ndrlj_255d82a1-7244-4c1c-ab7b-1ad9c2d49e6f/konnectivity-agent/0.log" Apr 17 11:29:15.675017 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:15.674976 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-114.ec2.internal_12581cfe4fc807f31862128cb3a75bcb/haproxy/0.log" Apr 17 11:29:19.522812 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:19.522776 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-cq52j_9cfdecbe-51b1-4568-bd12-9c62e4a2da29/manager/0.log" Apr 17 11:29:19.557961 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:19.557929 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-wrptj_3b1d5cf9-1d08-4319-8354-727b2fd97c58/manager/0.log" Apr 17 11:29:20.727831 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.727740 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_788c2618-9fc8-4d4c-9e7d-f72f5c6b6938/alertmanager/0.log" Apr 17 11:29:20.747547 
ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.747520 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_788c2618-9fc8-4d4c-9e7d-f72f5c6b6938/config-reloader/0.log" Apr 17 11:29:20.766677 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.766654 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_788c2618-9fc8-4d4c-9e7d-f72f5c6b6938/kube-rbac-proxy-web/0.log" Apr 17 11:29:20.787762 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.787746 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_788c2618-9fc8-4d4c-9e7d-f72f5c6b6938/kube-rbac-proxy/0.log" Apr 17 11:29:20.807235 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.807212 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_788c2618-9fc8-4d4c-9e7d-f72f5c6b6938/kube-rbac-proxy-metric/0.log" Apr 17 11:29:20.827302 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.827281 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_788c2618-9fc8-4d4c-9e7d-f72f5c6b6938/prom-label-proxy/0.log" Apr 17 11:29:20.848471 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.848449 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_788c2618-9fc8-4d4c-9e7d-f72f5c6b6938/init-config-reloader/0.log" Apr 17 11:29:20.889680 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.889658 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-2wsvr_c3bbc74d-0a7a-4056-af5e-f1e1491bfed5/cluster-monitoring-operator/0.log" Apr 17 11:29:20.910117 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.910097 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-nqq2j_78e0eced-4738-41b6-84e2-8c3d5dc008d8/kube-state-metrics/0.log" Apr 17 11:29:20.927729 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.927709 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-nqq2j_78e0eced-4738-41b6-84e2-8c3d5dc008d8/kube-rbac-proxy-main/0.log" Apr 17 11:29:20.945918 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.945894 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-nqq2j_78e0eced-4738-41b6-84e2-8c3d5dc008d8/kube-rbac-proxy-self/0.log" Apr 17 11:29:20.969307 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:20.969286 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-cbb9b68bb-dbb42_95f1ae67-46e1-4042-b30d-597a5948855c/metrics-server/0.log" Apr 17 11:29:21.023169 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.023147 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7cq4d_2bf7c07f-cc95-4712-aa18-c07b4d35d1a3/node-exporter/0.log" Apr 17 11:29:21.042248 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.042226 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7cq4d_2bf7c07f-cc95-4712-aa18-c07b4d35d1a3/kube-rbac-proxy/0.log" Apr 17 11:29:21.061223 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.061202 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7cq4d_2bf7c07f-cc95-4712-aa18-c07b4d35d1a3/init-textfile/0.log" Apr 17 11:29:21.247278 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.247202 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5q6cg_0f9da7be-ba93-41a4-890a-451226a11e8f/kube-rbac-proxy-main/0.log" Apr 17 11:29:21.268182 ip-10-0-142-114 
kubenswrapper[2568]: I0417 11:29:21.268159 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5q6cg_0f9da7be-ba93-41a4-890a-451226a11e8f/kube-rbac-proxy-self/0.log"
Apr 17 11:29:21.287436 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.287418 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5q6cg_0f9da7be-ba93-41a4-890a-451226a11e8f/openshift-state-metrics/0.log"
Apr 17 11:29:21.324207 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.324186 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9cda29a4-4b2d-4bc1-98d7-6affcd01044b/prometheus/0.log"
Apr 17 11:29:21.340977 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.340949 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9cda29a4-4b2d-4bc1-98d7-6affcd01044b/config-reloader/0.log"
Apr 17 11:29:21.363343 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.363322 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9cda29a4-4b2d-4bc1-98d7-6affcd01044b/thanos-sidecar/0.log"
Apr 17 11:29:21.381111 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.381092 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9cda29a4-4b2d-4bc1-98d7-6affcd01044b/kube-rbac-proxy-web/0.log"
Apr 17 11:29:21.401817 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.401800 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9cda29a4-4b2d-4bc1-98d7-6affcd01044b/kube-rbac-proxy/0.log"
Apr 17 11:29:21.421736 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.421720 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9cda29a4-4b2d-4bc1-98d7-6affcd01044b/kube-rbac-proxy-thanos/0.log"
Apr 17 11:29:21.443766 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.443745 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9cda29a4-4b2d-4bc1-98d7-6affcd01044b/init-config-reloader/0.log"
Apr 17 11:29:21.472979 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.472957 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-r4hbg_ec5a98f4-eb7f-4316-8786-d7bfaf42593e/prometheus-operator/0.log"
Apr 17 11:29:21.488963 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.488943 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-r4hbg_ec5a98f4-eb7f-4316-8786-d7bfaf42593e/kube-rbac-proxy/0.log"
Apr 17 11:29:21.519254 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.519201 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-qc28d_c617e6a0-6c4b-443d-baf3-c104f0de1db9/prometheus-operator-admission-webhook/0.log"
Apr 17 11:29:21.553621 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.553592 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bb88f555c-k7tn4_b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3/telemeter-client/0.log"
Apr 17 11:29:21.593203 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.593181 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bb88f555c-k7tn4_b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3/reload/0.log"
Apr 17 11:29:21.612695 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:21.612666 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bb88f555c-k7tn4_b9bd3d6c-e911-4b10-a3b3-87d27acc1dd3/kube-rbac-proxy/0.log"
Apr 17 11:29:23.074877 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:23.074848 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-gv5jl_50ebaa9d-a374-4732-b57c-5cfb2b64a318/networking-console-plugin/0.log"
Apr 17 11:29:23.588629 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:23.588597 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/2.log"
Apr 17 11:29:23.597312 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:23.597281 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-22ddd_134e0312-09a3-4d5f-b641-3d6579587cde/console-operator/3.log"
Apr 17 11:29:24.506954 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.506924 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-r5k85_c336d09a-931b-409b-a010-6bf7cb87a9a9/volume-data-source-validator/0.log"
Apr 17 11:29:24.764070 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.763998 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"]
Apr 17 11:29:24.767481 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.767456 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.770563 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.770537 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2skc\"/\"kube-root-ca.crt\""
Apr 17 11:29:24.770709 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.770545 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d2skc\"/\"default-dockercfg-6vh64\""
Apr 17 11:29:24.771707 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.771682 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2skc\"/\"openshift-service-ca.crt\""
Apr 17 11:29:24.773841 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.773697 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-lib-modules\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.773841 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.773747 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-podres\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.773841 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.773797 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7d9v\" (UniqueName: \"kubernetes.io/projected/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-kube-api-access-j7d9v\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.773841 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.773830 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-proc\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.774159 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.773865 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-sys\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.776483 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.776460 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"]
Apr 17 11:29:24.874749 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.874721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-lib-modules\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.874921 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.874756 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-podres\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.874921 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.874784 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7d9v\" (UniqueName: \"kubernetes.io/projected/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-kube-api-access-j7d9v\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.874921 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.874801 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-proc\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.874921 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.874822 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-sys\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.874921 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.874891 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-proc\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.874921 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.874892 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-podres\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.874921 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.874900 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-lib-modules\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.875152 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.874930 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-sys\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:24.883442 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:24.883416 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7d9v\" (UniqueName: \"kubernetes.io/projected/3809e3cf-e0f3-4279-8cbd-fa1663b8adf3-kube-api-access-j7d9v\") pod \"perf-node-gather-daemonset-8njsv\" (UID: \"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:25.077811 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:25.077787 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:25.370337 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:25.370264 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hst8c_9443878d-c2b0-4771-b41e-f23e0fff86a4/dns/0.log"
Apr 17 11:29:25.390122 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:25.390089 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hst8c_9443878d-c2b0-4771-b41e-f23e0fff86a4/kube-rbac-proxy/0.log"
Apr 17 11:29:25.399216 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:25.399151 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"]
Apr 17 11:29:25.401670 ip-10-0-142-114 kubenswrapper[2568]: W0417 11:29:25.401639 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3809e3cf_e0f3_4279_8cbd_fa1663b8adf3.slice/crio-e68520d7c74db0d99229b7861cec7d7bf755e171a9a2078c4a63011816aedf16 WatchSource:0}: Error finding container e68520d7c74db0d99229b7861cec7d7bf755e171a9a2078c4a63011816aedf16: Status 404 returned error can't find the container with id e68520d7c74db0d99229b7861cec7d7bf755e171a9a2078c4a63011816aedf16
Apr 17 11:29:25.403415 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:25.403394 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:29:25.436260 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:25.436239 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h7hkk_61d4b955-d2fa-4cee-a5a9-5bb37d994e5f/dns-node-resolver/0.log"
Apr 17 11:29:25.964950 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:25.964924 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kgk95_0ebdddb3-e6b6-4191-9db1-01e8d15cae25/node-ca/0.log"
Apr 17 11:29:26.370010 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:26.369967 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv" event={"ID":"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3","Type":"ContainerStarted","Data":"dbca9e8fd57a42ba8fa834ce52df575f30f57e707a66b721b694ba60de7e8b26"}
Apr 17 11:29:26.370010 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:26.370015 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv" event={"ID":"3809e3cf-e0f3-4279-8cbd-fa1663b8adf3","Type":"ContainerStarted","Data":"e68520d7c74db0d99229b7861cec7d7bf755e171a9a2078c4a63011816aedf16"}
Apr 17 11:29:26.370230 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:26.370050 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:26.389062 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:26.389022 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv" podStartSLOduration=2.389007749 podStartE2EDuration="2.389007749s" podCreationTimestamp="2026-04-17 11:29:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:29:26.387084053 +0000 UTC m=+796.309938871" watchObservedRunningTime="2026-04-17 11:29:26.389007749 +0000 UTC m=+796.311862567"
Apr 17 11:29:26.738739 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:26.738665 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-wvdj2_aeb39f46-0617-47cd-8bed-daf1632ca8f7/discovery/0.log"
Apr 17 11:29:27.240032 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:27.239995 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8jgxz_015be0d7-ff4e-4b65-b3ee-73d579ba395e/serve-healthcheck-canary/0.log"
Apr 17 11:29:27.709744 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:27.709708 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8q9jb_bfdb9877-c4ed-40e3-9a4c-80fe70a2f755/insights-operator/1.log"
Apr 17 11:29:27.709906 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:27.709862 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8q9jb_bfdb9877-c4ed-40e3-9a4c-80fe70a2f755/insights-operator/0.log"
Apr 17 11:29:27.800268 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:27.800239 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hx7b7_46e1bfc0-432d-4c42-9d39-5eba5a87ea6a/kube-rbac-proxy/0.log"
Apr 17 11:29:27.818922 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:27.818900 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hx7b7_46e1bfc0-432d-4c42-9d39-5eba5a87ea6a/exporter/0.log"
Apr 17 11:29:27.839526 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:27.839504 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hx7b7_46e1bfc0-432d-4c42-9d39-5eba5a87ea6a/extractor/0.log"
Apr 17 11:29:30.393413 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:30.393318 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-56b679b657-mghcj_523162f7-1208-467c-ad98-9d96d5c94f11/manager/0.log"
Apr 17 11:29:32.383797 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:32.383769 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-8njsv"
Apr 17 11:29:37.078771 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:37.078739 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mt24_9b1b3449-3e5b-448f-a69c-f6678b42b96b/kube-multus/0.log"
Apr 17 11:29:37.126364 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:37.126334 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw5h9_a995cba3-0edd-41aa-923f-d47b9d050676/kube-multus-additional-cni-plugins/0.log"
Apr 17 11:29:37.148838 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:37.148813 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw5h9_a995cba3-0edd-41aa-923f-d47b9d050676/egress-router-binary-copy/0.log"
Apr 17 11:29:37.170580 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:37.170550 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw5h9_a995cba3-0edd-41aa-923f-d47b9d050676/cni-plugins/0.log"
Apr 17 11:29:37.192153 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:37.192131 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw5h9_a995cba3-0edd-41aa-923f-d47b9d050676/bond-cni-plugin/0.log"
Apr 17 11:29:37.216006 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:37.215963 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw5h9_a995cba3-0edd-41aa-923f-d47b9d050676/routeoverride-cni/0.log"
Apr 17 11:29:37.239487 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:37.239463 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw5h9_a995cba3-0edd-41aa-923f-d47b9d050676/whereabouts-cni-bincopy/0.log"
Apr 17 11:29:37.258617 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:37.258601 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw5h9_a995cba3-0edd-41aa-923f-d47b9d050676/whereabouts-cni/0.log"
Apr 17 11:29:37.633183 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:37.633150 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4d5rw_d84dc363-0ebb-4e0c-9b94-1024f80ccbb3/network-metrics-daemon/0.log"
Apr 17 11:29:37.651793 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:37.651763 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4d5rw_d84dc363-0ebb-4e0c-9b94-1024f80ccbb3/kube-rbac-proxy/0.log"
Apr 17 11:29:38.808508 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:38.808472 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sps6r_f0fa0497-6cc8-4a84-b902-a5b9ad486d28/ovn-controller/0.log"
Apr 17 11:29:38.834133 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:38.834105 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sps6r_f0fa0497-6cc8-4a84-b902-a5b9ad486d28/ovn-acl-logging/0.log"
Apr 17 11:29:38.856449 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:38.856422 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sps6r_f0fa0497-6cc8-4a84-b902-a5b9ad486d28/kube-rbac-proxy-node/0.log"
Apr 17 11:29:38.879362 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:38.879334 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sps6r_f0fa0497-6cc8-4a84-b902-a5b9ad486d28/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 11:29:38.894317 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:38.894295 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sps6r_f0fa0497-6cc8-4a84-b902-a5b9ad486d28/northd/0.log"
Apr 17 11:29:38.915287 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:38.915262 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sps6r_f0fa0497-6cc8-4a84-b902-a5b9ad486d28/nbdb/0.log"
Apr 17 11:29:38.938943 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:38.938907 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sps6r_f0fa0497-6cc8-4a84-b902-a5b9ad486d28/sbdb/0.log"
Apr 17 11:29:39.150589 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:39.150517 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sps6r_f0fa0497-6cc8-4a84-b902-a5b9ad486d28/ovnkube-controller/0.log"
Apr 17 11:29:40.604417 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:40.604374 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-srg6p_cb867750-66e0-49fa-b347-fa907f29bbae/network-check-target-container/0.log"
Apr 17 11:29:41.644863 ip-10-0-142-114 kubenswrapper[2568]: I0417 11:29:41.644836 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rz67d_c8d89149-a1c2-4e87-941b-ce08710499d4/iptables-alerter/0.log"