Apr 23 13:31:54.196271 ip-10-0-141-22 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 13:31:54.594397 ip-10-0-141-22 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:54.594397 ip-10-0-141-22 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 13:31:54.594397 ip-10-0-141-22 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:54.594397 ip-10-0-141-22 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 13:31:54.594898 ip-10-0-141-22 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
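The deprecation warnings above all point at the same remedy: carry those settings in the KubeletConfiguration file named by --config instead of on the command line. A minimal sketch of such a file follows; the field names (containerRuntimeEndpoint, volumePluginDir, systemReserved, evictionHard) are real KubeletConfiguration v1beta1 fields, but the values and the /tmp path are illustrative assumptions, not what this node actually uses.

```shell
# Hypothetical sketch: express the deprecated flags as KubeletConfiguration
# fields. On a real node this content would live at the path given to
# --config (here the log shows /etc/kubernetes/kubelet.conf).
cat <<'EOF' > /tmp/kubelet-config-sketch.yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservation values)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 200Mi
EOF
grep -q '^kind: KubeletConfiguration' /tmp/kubelet-config-sketch.yaml && echo "sketch written"
```

Writing the file does not apply it; the kubelet only reads it at startup via --config.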
Apr 23 13:31:54.595835 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.595773 2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 13:31:54.602967 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.602927 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:54.603073 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603064 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:54.603073 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603072 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603076 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603079 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603083 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603088 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603090 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603094 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603097 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603099 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603102 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603105 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603108 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603110 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603113 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603116 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603119 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603122 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603124 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603127 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603129 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:54.603139 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603133 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603136 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603139 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603141 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603144 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603148 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603150 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603153 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603156 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603159 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603162 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603166 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603170 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603173 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603176 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603178 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603181 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603184 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603187 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:54.603619 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603189 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603192 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603194 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603197 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603199 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603202 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603205 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603207 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603210 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603213 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603215 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603218 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603220 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603225 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603228 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603231 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603235 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603239 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603243 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:54.604107 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603246 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603249 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603252 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603255 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603257 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603260 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603263 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603265 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603268 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603271 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603273 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603276 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603278 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603281 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603283 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603286 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603290 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603293 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603296 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603298 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:54.604567 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603301 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603304 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603307 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603309 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603312 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603315 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603709 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603715 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603719 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603722 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603725 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603729 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603733 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603736 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603739 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603742 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603745 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603748 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603750 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:54.605100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603752 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603755 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603758 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603760 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603763 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603765 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603768 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603770 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603773 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603776 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603778 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603781 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603784 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603786 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603789 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603792 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603794 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603797 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603799 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:54.605558 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603802 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603804 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603807 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603810 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603812 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603815 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603817 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603820 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603823 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603825 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603828 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603831 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603833 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603836 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603838 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603841 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603843 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603846 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603848 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603851 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:54.606064 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603853 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603858 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
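The same set of "unrecognized feature gate" warnings appears twice above because the kubelet parses the gate list more than once at startup. A small pipeline collapses the noise into a unique, counted list; the sample lines below stand in for what `journalctl -u kubelet` would provide on a live node.

```shell
# Sketch: count each unrecognized feature gate once. Replace the sample
# variable with real input, e.g.:  journalctl -u kubelet | ...
sample='W0423 13:31:54.603076 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
W0423 13:31:54.603812 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
W0423 13:31:54.603079 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup'

# Strip everything up to the gate name, then count duplicates.
printf '%s\n' "$sample" \
  | sed -n 's/.*unrecognized feature gate: //p' \
  | sort | uniq -c | sort -rn
```

On this log, a gate appearing with count 2 was warned about in both parsing passes; gates are harmless noise if they belong to a different component's gate set, but a count of 1 in only one pass can hint at an inconsistent configuration.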
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603861 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603865 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603868 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603871 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603874 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603876 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603879 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603881 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603884 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603886 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603889 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603891 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603894 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603897 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603900 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603903 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603905 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603908 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:54.606557 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603910 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603914 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603916 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603919 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603921 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603924 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603926 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603929 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603931 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603933 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603936 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603938 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603941 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.603943 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604670 2562 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604680 2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604686 2562 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604691 2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604696 2562 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604699 2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604704 2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 13:31:54.607062 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604709 2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604712 2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604715 2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604718 2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604722 2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604725 2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604727 2562 flags.go:64] FLAG: --cgroup-root=""
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604730 2562 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604734 2562 flags.go:64] FLAG: --client-ca-file=""
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604736 2562 flags.go:64] FLAG: --cloud-config=""
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604739 2562 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604742 2562 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604746 2562 flags.go:64] FLAG: --cluster-domain=""
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604749 2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604752 2562 flags.go:64] FLAG: --config-dir=""
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604755 2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604758 2562 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604763 2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604765 2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604769 2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604772 2562 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604775 2562 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604778 2562 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604781 2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604784 2562 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 13:31:54.607577 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604787 2562 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604791 2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604794 2562 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604797 2562 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604800 2562 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604804 2562 flags.go:64] FLAG: --enable-server="true"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604807 2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604811 2562 flags.go:64] FLAG: --event-burst="100"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604814 2562 flags.go:64] FLAG: --event-qps="50"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604817 2562 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604820 2562 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604824 2562 flags.go:64] FLAG: --eviction-hard=""
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604828 2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604831 2562 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604834 2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604837 2562 flags.go:64] FLAG: --eviction-soft=""
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604840 2562 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604843 2562 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604847 2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604849 2562 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604852 2562 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604855 2562 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604858 2562 flags.go:64] FLAG: --feature-gates=""
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604862 2562 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604865 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 13:31:54.608214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604868 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604871 2562 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604874 2562 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604877 2562 flags.go:64] FLAG: --help="false"
Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604883 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-141-22.ec2.internal"
Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604886 2562 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604889 2562 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604892 2562 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604895 2562 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604898 2562 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604901 2562 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604904 2562 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604907 2562 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604911 2562 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604914 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604917 2562 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604920 2562 flags.go:64] FLAG: --kube-reserved="" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604923 2562 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604926 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604929 2562 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604932 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 13:31:54.608812 ip-10-0-141-22 
kubenswrapper[2562]: I0423 13:31:54.604935 2562 flags.go:64] FLAG: --lock-file="" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604938 2562 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604941 2562 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 13:31:54.608812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604944 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604949 2562 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604952 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604955 2562 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604958 2562 flags.go:64] FLAG: --logging-format="text" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604960 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604964 2562 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604967 2562 flags.go:64] FLAG: --manifest-url="" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604970 2562 flags.go:64] FLAG: --manifest-url-header="" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604975 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604978 2562 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604982 2562 flags.go:64] FLAG: --max-pods="110" Apr 23 13:31:54.609416 
ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604987 2562 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604990 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604993 2562 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604996 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.604999 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605002 2562 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605005 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605025 2562 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605029 2562 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605032 2562 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605035 2562 flags.go:64] FLAG: --pod-cidr="" Apr 23 13:31:54.609416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605039 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605044 2562 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605047 2562 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605050 2562 flags.go:64] FLAG: --pods-per-core="0" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605053 2562 flags.go:64] FLAG: --port="10250" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605056 2562 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605060 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08d0ae1f6fb57929a" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605063 2562 flags.go:64] FLAG: --qos-reserved="" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605067 2562 flags.go:64] FLAG: --read-only-port="10255" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605069 2562 flags.go:64] FLAG: --register-node="true" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605072 2562 flags.go:64] FLAG: --register-schedulable="true" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605075 2562 flags.go:64] FLAG: --register-with-taints="" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605079 2562 flags.go:64] FLAG: --registry-burst="10" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605081 2562 flags.go:64] FLAG: --registry-qps="5" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605084 2562 flags.go:64] FLAG: --reserved-cpus="" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605087 2562 flags.go:64] FLAG: --reserved-memory="" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605091 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605094 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 
13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605097 2562 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605100 2562 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605103 2562 flags.go:64] FLAG: --runonce="false" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605106 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605111 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605114 2562 flags.go:64] FLAG: --seccomp-default="false" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605117 2562 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605120 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 13:31:54.610057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605123 2562 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605126 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605130 2562 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605133 2562 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605135 2562 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605138 2562 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 
13:31:54.605142 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605145 2562 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605148 2562 flags.go:64] FLAG: --system-cgroups="" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605151 2562 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605156 2562 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605159 2562 flags.go:64] FLAG: --tls-cert-file="" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605162 2562 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605166 2562 flags.go:64] FLAG: --tls-min-version="" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605169 2562 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605172 2562 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605175 2562 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605178 2562 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605181 2562 flags.go:64] FLAG: --v="2" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605185 2562 flags.go:64] FLAG: --version="false" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605189 2562 flags.go:64] FLAG: --vmodule="" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 
13:31:54.605196 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.605200 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605294 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:54.610680 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605298 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605301 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605303 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605306 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605311 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605315 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605319 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605321 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605324 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605327 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605330 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605332 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605335 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605337 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605340 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605344 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605348 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605350 2562 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605353 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605357 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:31:54.611370 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605360 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605363 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605366 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605369 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605372 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605375 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605378 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605380 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605383 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605386 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:54.611881 ip-10-0-141-22 
kubenswrapper[2562]: W0423 13:31:54.605389 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605392 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605394 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605397 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605400 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605403 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605408 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605411 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605413 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:54.611881 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605416 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605419 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605422 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605425 2562 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605427 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605430 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605433 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605435 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605438 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605443 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605445 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605448 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605451 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605453 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605456 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605459 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 
13:31:54.605462 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605464 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605467 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605470 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:54.612368 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605473 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605475 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605478 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605480 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605483 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605486 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605488 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605491 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605493 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 
13:31:54.605497 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605500 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605503 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605505 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605508 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605510 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605513 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605516 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605518 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605521 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605523 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:54.612856 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605526 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605529 2562 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpoints Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605532 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605535 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605537 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.605540 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.606349 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.611890 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.611906 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611952 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611957 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611961 2562 feature_gate.go:351] Setting GA 
feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611966 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611969 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611972 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:54.613400 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611976 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611979 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611982 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611985 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611988 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611991 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611993 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611996 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.611999 2562 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612001 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612004 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612006 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612009 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612011 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612034 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612036 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612039 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612042 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612045 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:54.613776 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612048 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612051 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:54.614275 ip-10-0-141-22 
kubenswrapper[2562]: W0423 13:31:54.612054 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612056 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612059 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612061 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612065 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612068 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612070 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612073 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612076 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612079 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612082 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612084 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612087 2562 feature_gate.go:328] unrecognized feature 
gate: AzureMultiDisk Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612090 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612092 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612095 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612098 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612101 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:54.614275 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612103 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612106 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612108 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612111 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612114 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612116 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612119 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612121 2562 
feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612124 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612127 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612129 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612133 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612137 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612141 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612143 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612146 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612148 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612151 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612154 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:54.614766 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612158 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:54.614766 ip-10-0-141-22 
kubenswrapper[2562]: W0423 13:31:54.612160 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612163 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612165 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612168 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612171 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612173 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612176 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612179 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612181 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612184 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612186 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612189 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612191 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:54.615322 ip-10-0-141-22 
kubenswrapper[2562]: W0423 13:31:54.612194 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612197 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612199 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612202 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612204 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612207 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612209 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:54.615322 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612211 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.612216 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612308 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:54.615854 
ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612312 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612315 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612318 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612321 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612324 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612327 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612329 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612333 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612336 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612339 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612341 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612344 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:54.615854 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612346 2562 feature_gate.go:328] unrecognized 
feature gate: GatewayAPIController Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612349 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612351 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612354 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612357 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612359 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612362 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612364 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612367 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612369 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612372 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612375 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612378 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: 
W0423 13:31:54.612380 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612383 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612386 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612388 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612391 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612393 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612396 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:54.616333 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612398 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612401 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612403 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612406 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612409 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612412 2562 feature_gate.go:328] unrecognized feature gate: 
AlibabaPlatform Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612414 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612418 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612421 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612423 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612426 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612428 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612431 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612433 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612436 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612438 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612441 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612443 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612446 2562 feature_gate.go:328] unrecognized feature gate: 
InsightsConfig Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612448 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:54.616857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612451 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612453 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612456 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612458 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612461 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612464 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612467 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612470 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612472 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612475 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612477 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612480 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612482 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612484 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612487 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612490 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612492 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612501 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612504 2562 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612507 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:54.617439 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612510 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612512 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612515 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612517 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612520 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612523 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612527 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612530 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612533 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612536 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612539 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612542 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:54.612544 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.612549 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:31:54.618242 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.613107 2562 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 13:31:54.618633 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.614915 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 13:31:54.618633 ip-10-0-141-22 kubenswrapper[2562]: I0423 
13:31:54.615781 2562 server.go:1019] "Starting client certificate rotation" Apr 23 13:31:54.618633 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.615889 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 13:31:54.618633 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.615933 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 13:31:54.641720 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.641702 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 13:31:54.644454 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.644434 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 13:31:54.655478 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.655461 2562 log.go:25] "Validated CRI v1 runtime API" Apr 23 13:31:54.660812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.660798 2562 log.go:25] "Validated CRI v1 image API" Apr 23 13:31:54.664026 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.663998 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 13:31:54.668144 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.668123 2562 fs.go:135] Filesystem UUIDs: map[312c522f-1c1a-4dea-bc24-c647d64cb484:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 dd7f30aa-7568-40be-bc68-5a4aaee28749:/dev/nvme0n1p4] Apr 23 13:31:54.668214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.668142 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp 
major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 13:31:54.670777 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.670758 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 13:31:54.673803 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.673684 2562 manager.go:217] Machine: {Timestamp:2026-04-23 13:31:54.671846584 +0000 UTC m=+0.366751841 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100005 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29a74b48827f780f5a2abe7c055db1 SystemUUID:ec29a74b-4882-7f78-0f5a-2abe7c055db1 BootID:86dcb0cb-c38d-4323-a348-de818372c7ce Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e3:f5:c3:1b:a5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e3:f5:c3:1b:a5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:fa:e2:e0:c6:fd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 13:31:54.673803 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.673794 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 13:31:54.673961 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.673882 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 13:31:54.674918 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.674891 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 13:31:54.675096 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.674921 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-22.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 13:31:54.675174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.675108 2562 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 13:31:54.675174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.675120 2562 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 13:31:54.675174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.675138 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:31:54.675869 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.675857 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:31:54.677058 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.677046 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:31:54.677178 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.677167 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 13:31:54.679224 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.679213 2562 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 13:31:54.679290 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.679238 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 13:31:54.679290 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.679254 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 13:31:54.679290 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.679266 2562 kubelet.go:397] "Adding apiserver pod source"
Apr 23 13:31:54.679290 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.679278 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 13:31:54.680319 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.680306 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:31:54.680380 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.680328 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:31:54.683455 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.683434 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 13:31:54.684607 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.684594 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 13:31:54.686128 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686116 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 13:31:54.686171 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686133 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 13:31:54.686171 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686139 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 13:31:54.686171 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686145 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 13:31:54.686171 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686150 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 13:31:54.686171 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686156 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 13:31:54.686171 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686161 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 13:31:54.686171 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686166 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 13:31:54.686171 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686173 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 13:31:54.686380 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686179 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 13:31:54.686380 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686191 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 13:31:54.686380 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686199 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 13:31:54.687008 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.686999 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 13:31:54.687008 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.687008 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 13:31:54.690164 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.690143 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-22.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 13:31:54.690164 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.690152 2562 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-22.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 13:31:54.690326 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.690307 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 13:31:54.690459 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.690448 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 13:31:54.690506 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.690482 2562 server.go:1295] "Started kubelet"
Apr 23 13:31:54.690599 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.690570 2562 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 13:31:54.690632 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.690575 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 13:31:54.691353 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.691330 2562 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 13:31:54.691782 ip-10-0-141-22 systemd[1]: Started Kubernetes Kubelet.
Apr 23 13:31:54.692800 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.692710 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 13:31:54.693262 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.693245 2562 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 13:31:54.694534 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.694513 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9vm7x"
Apr 23 13:31:54.698680 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.698651 2562 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 13:31:54.700282 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.700265 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9vm7x"
Apr 23 13:31:54.700912 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.700074 2562 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-22.ec2.internal.18a8ff9fc65bf9eb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-22.ec2.internal,UID:ip-10-0-141-22.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-22.ec2.internal,},FirstTimestamp:2026-04-23 13:31:54.690460139 +0000 UTC m=+0.385365395,LastTimestamp:2026-04-23 13:31:54.690460139 +0000 UTC m=+0.385365395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-22.ec2.internal,}"
Apr 23 13:31:54.701031 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.701006 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 13:31:54.701073 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.701038 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 13:31:54.701699 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.701671 2562 factory.go:55] Registering systemd factory
Apr 23 13:31:54.701699 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.701701 2562 factory.go:223] Registration of the systemd container factory successfully
Apr 23 13:31:54.701832 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.701770 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 13:31:54.701832 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.701771 2562 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 13:31:54.701832 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.701817 2562 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 13:31:54.701969 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.701913 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:54.701969 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.701951 2562 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 13:31:54.701969 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.701960 2562 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 13:31:54.702144 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.701988 2562 factory.go:153] Registering CRI-O factory
Apr 23 13:31:54.702144 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.702000 2562 factory.go:223] Registration of the crio container factory successfully
Apr 23 13:31:54.702144 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.702065 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 13:31:54.702144 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.702095 2562 factory.go:103] Registering Raw factory
Apr 23 13:31:54.702144 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.702110 2562 manager.go:1196] Started watching for new ooms in manager
Apr 23 13:31:54.702501 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.702490 2562 manager.go:319] Starting recovery of all containers
Apr 23 13:31:54.711567 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.711462 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:54.711660 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.711633 2562 manager.go:324] Recovery completed
Apr 23 13:31:54.714137 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.714121 2562 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-22.ec2.internal\" not found" node="ip-10-0-141-22.ec2.internal"
Apr 23 13:31:54.715766 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.715755 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:54.717900 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.717887 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:54.717963 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.717912 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:54.717963 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.717924 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:54.718377 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.718364 2562 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 13:31:54.718377 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.718376 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 13:31:54.718458 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.718391 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:31:54.720275 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.720262 2562 policy_none.go:49] "None policy: Start"
Apr 23 13:31:54.720328 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.720281 2562 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 13:31:54.720328 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.720295 2562 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 13:31:54.763558 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.763540 2562 manager.go:341] "Starting Device Plugin manager"
Apr 23 13:31:54.765786 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.763584 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 13:31:54.765786 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.763595 2562 server.go:85] "Starting device plugin registration server"
Apr 23 13:31:54.765786 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.763797 2562 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 13:31:54.765786 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.763808 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 13:31:54.765786 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.763891 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 13:31:54.765786 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.764004 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 13:31:54.765786 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.764027 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 13:31:54.765786 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.764419 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 13:31:54.765786 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.764445 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:54.834912 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.834881 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 13:31:54.836098 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.836083 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 13:31:54.836156 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.836108 2562 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 13:31:54.836156 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.836122 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 13:31:54.836156 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.836129 2562 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 13:31:54.836294 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.836158 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 13:31:54.838835 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.838819 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:54.864293 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.864258 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:54.864951 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.864937 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:54.865011 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.864962 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:54.865011 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.864974 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:54.865011 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.864998 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-22.ec2.internal"
Apr 23 13:31:54.870207 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.870194 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-22.ec2.internal"
Apr 23 13:31:54.870272 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.870213 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-22.ec2.internal\": node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:54.893974 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.893955 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:54.937219 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.937197 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal"]
Apr 23 13:31:54.937299 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.937260 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:54.938560 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.938547 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:54.938618 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.938571 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:54.938618 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.938581 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:54.939694 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.939683 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:54.939818 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.939805 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:54.939854 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.939846 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:54.940338 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.940325 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:54.940399 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.940339 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:54.940399 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.940349 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:54.940399 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.940360 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:54.940399 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.940371 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:54.940535 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.940362 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:54.941451 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.941437 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:54.941532 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.941460 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:54.944670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.944372 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:54.944670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.944396 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:54.944670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:54.944410 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:54.965620 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.965602 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-22.ec2.internal\" not found" node="ip-10-0-141-22.ec2.internal"
Apr 23 13:31:54.969833 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.969816 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-22.ec2.internal\" not found" node="ip-10-0-141-22.ec2.internal"
Apr 23 13:31:54.994267 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:54.994247 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:55.094941 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:55.094910 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:55.102720 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.102700 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f94592837d3c06bcac331383e3a3148d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal\" (UID: \"f94592837d3c06bcac331383e3a3148d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.102781 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.102724 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f94592837d3c06bcac331383e3a3148d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal\" (UID: \"f94592837d3c06bcac331383e3a3148d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.102781 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.102742 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dc1728671f1199487e42b58400f18934-config\") pod \"kube-apiserver-proxy-ip-10-0-141-22.ec2.internal\" (UID: \"dc1728671f1199487e42b58400f18934\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.195149 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:55.195119 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:55.203399 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.203384 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dc1728671f1199487e42b58400f18934-config\") pod \"kube-apiserver-proxy-ip-10-0-141-22.ec2.internal\" (UID: \"dc1728671f1199487e42b58400f18934\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.203474 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.203406 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f94592837d3c06bcac331383e3a3148d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal\" (UID: \"f94592837d3c06bcac331383e3a3148d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.203474 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.203424 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f94592837d3c06bcac331383e3a3148d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal\" (UID: \"f94592837d3c06bcac331383e3a3148d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.203474 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.203462 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f94592837d3c06bcac331383e3a3148d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal\" (UID: \"f94592837d3c06bcac331383e3a3148d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.203565 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.203504 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f94592837d3c06bcac331383e3a3148d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal\" (UID: \"f94592837d3c06bcac331383e3a3148d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.203565 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.203504 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dc1728671f1199487e42b58400f18934-config\") pod \"kube-apiserver-proxy-ip-10-0-141-22.ec2.internal\" (UID: \"dc1728671f1199487e42b58400f18934\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.267558 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.267538 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.271891 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.271876 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:55.295251 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:55.295224 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:55.395735 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:55.395704 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:55.496176 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:55.496128 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:55.596620 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:55.596600 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:55.616130 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.616112 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 13:31:55.616252 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.616236 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:31:55.616315 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.616267 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:31:55.653102 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.653083 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:55.697497 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:55.697478 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:55.701405 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.701387 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 13:31:55.702497 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.702469 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:26:54 +0000 UTC" deadline="2028-01-31 03:22:33.18403076 +0000 UTC"
Apr 23 13:31:55.702545 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.702497 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15541h50m37.481536s"
Apr 23 13:31:55.711569 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.711552 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:31:55.736228 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.736207 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zt5zf"
Apr 23 13:31:55.744463 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.744442 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zt5zf"
Apr 23 13:31:55.749631 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:55.749565 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf94592837d3c06bcac331383e3a3148d.slice/crio-d53d7e348ac2ddb047b3cebd93cbfbf2b6924ca469a4534bf561769ed900e395 WatchSource:0}: Error finding container d53d7e348ac2ddb047b3cebd93cbfbf2b6924ca469a4534bf561769ed900e395: Status 404 returned error can't find the container with id d53d7e348ac2ddb047b3cebd93cbfbf2b6924ca469a4534bf561769ed900e395
Apr 23 13:31:55.750083 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:55.750066 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc1728671f1199487e42b58400f18934.slice/crio-74c09ba7ef7bb6bcfffa9f9814b1894972d764f2cf8e111063e7e45ba6486d2a WatchSource:0}: Error finding container 74c09ba7ef7bb6bcfffa9f9814b1894972d764f2cf8e111063e7e45ba6486d2a: Status 404 returned error can't find the container with id 74c09ba7ef7bb6bcfffa9f9814b1894972d764f2cf8e111063e7e45ba6486d2a
Apr 23 13:31:55.754372 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.754359 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:31:55.798350 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:55.798333 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:55.838423 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.838383 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal" event={"ID":"dc1728671f1199487e42b58400f18934","Type":"ContainerStarted","Data":"74c09ba7ef7bb6bcfffa9f9814b1894972d764f2cf8e111063e7e45ba6486d2a"}
Apr 23 13:31:55.839307 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.839287 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal" event={"ID":"f94592837d3c06bcac331383e3a3148d","Type":"ContainerStarted","Data":"d53d7e348ac2ddb047b3cebd93cbfbf2b6924ca469a4534bf561769ed900e395"}
Apr 23 13:31:55.898388 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:55.898366 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-22.ec2.internal\" not found"
Apr 23 13:31:55.929833 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:55.929810 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:56.002077 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.002033 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:56.013277 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.013260 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 13:31:56.014313 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.014302 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal"
Apr 23 13:31:56.020544 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.020528 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 13:31:56.588892 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.588863 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:56.680178 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.680150 2562 apiserver.go:52] "Watching apiserver"
Apr 23 13:31:56.688458 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.688435 2562 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 13:31:56.690532 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.690506 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-xgv6v","openshift-image-registry/node-ca-hc6xl","openshift-multus/multus-bjz54","openshift-multus/network-metrics-daemon-kdj59","openshift-network-operator/iptables-alerter-djgmc","openshift-dns/node-resolver-4mlb8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal","openshift-multus/multus-additional-cni-plugins-87dsw","openshift-network-diagnostics/network-check-target-xmjsn","openshift-ovn-kubernetes/ovnkube-node-bmczc","kube-system/konnectivity-agent-hv7lz","kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"]
Apr 23 13:31:56.692996 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.692972 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.693924 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.693907 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.694996 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.694970 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:31:56.695106 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:56.695068 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030" Apr 23 13:31:56.696689 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.696053 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-djgmc" Apr 23 13:31:56.696689 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.696200 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 13:31:56.696689 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.696548 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 13:31:56.697426 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.697099 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-4mlb8" Apr 23 13:31:56.697426 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.697144 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5h8nb\"" Apr 23 13:31:56.697803 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.697782 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 13:31:56.699400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.698176 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 13:31:56.699400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.698311 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:31:56.699400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.698368 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 13:31:56.699400 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:56.698366 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692" Apr 23 13:31:56.699400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.698547 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 13:31:56.699400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.698834 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 13:31:56.699400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.698999 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 13:31:56.699400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.699222 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 13:31:56.699400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.699318 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9whf8\"" Apr 23 13:31:56.699852 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.699508 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 13:31:56.699852 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.699555 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.699852 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.699649 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xc524\"" Apr 23 13:31:56.699852 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.699760 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:56.700133 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.699937 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 13:31:56.700133 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.699511 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:56.700133 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.700094 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 13:31:56.700273 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.700217 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 13:31:56.701296 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.700588 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-kbm2f\"" Apr 23 13:31:56.701733 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.701713 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.704171 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.703545 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-prq5s\"" Apr 23 13:31:56.704171 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.703726 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:56.704783 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.704764 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 13:31:56.705039 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.705007 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pzgzn\"" Apr 23 13:31:56.705290 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.705272 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 13:31:56.707423 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.707403 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:56.708822 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.708801 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" Apr 23 13:31:56.709964 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.709942 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-daemon-config\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.710070 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710000 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-run-multus-certs\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.710243 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710215 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-cni-bin\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.710307 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710257 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-run\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.710307 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710284 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-sys\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.710405 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710309 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" Apr 23 13:31:56.710405 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710332 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-sys-fs\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" Apr 23 13:31:56.710405 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710356 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-var-lib-cni-multus\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.710405 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710384 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.710596 
ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710422 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d337b45-35c2-42c7-a28e-1498d3ec882d-ovnkube-script-lib\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.710596 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710446 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-sysconfig\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.710596 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710470 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-socket-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" Apr 23 13:31:56.710596 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710495 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-run-netns\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.710596 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710519 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r6b4\" (UniqueName: \"kubernetes.io/projected/3ab46110-c05e-4dde-9c0e-2a035e761a4a-kube-api-access-4r6b4\") pod \"multus-bjz54\" 
(UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.710596 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710540 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-run-openvswitch\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.710596 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710561 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-registration-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" Apr 23 13:31:56.710596 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710583 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ab46110-c05e-4dde-9c0e-2a035e761a4a-cni-binary-copy\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710606 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db736963-96fd-4537-b82e-5a28f2543a84-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710631 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7k7rr\" (UniqueName: \"kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr\") pod \"network-check-target-xmjsn\" (UID: \"2a1fbae2-2df7-41eb-9ed9-aac09b5af692\") " pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710653 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb69ff46-2dee-4fb6-ad10-74074668a10f-tmp-dir\") pod \"node-resolver-4mlb8\" (UID: \"eb69ff46-2dee-4fb6-ad10-74074668a10f\") " pod="openshift-dns/node-resolver-4mlb8" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710675 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-tuned\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710696 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-hostroot\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710723 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-etc-openvswitch\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710752 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-cnibin\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710774 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-slash\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710827 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-var-lib-openvswitch\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710855 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-modprobe-d\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710899 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-os-release\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " 
pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.710949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710933 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-var-lib-kubelet\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710955 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d5b9a202-5bbf-447f-828a-8504cdc5749e-agent-certs\") pod \"konnectivity-agent-hv7lz\" (UID: \"d5b9a202-5bbf-447f-828a-8504cdc5749e\") " pod="kube-system/konnectivity-agent-hv7lz" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.710980 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb5tf\" (UniqueName: \"kubernetes.io/projected/eb69ff46-2dee-4fb6-ad10-74074668a10f-kube-api-access-bb5tf\") pod \"node-resolver-4mlb8\" (UID: \"eb69ff46-2dee-4fb6-ad10-74074668a10f\") " pod="openshift-dns/node-resolver-4mlb8" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711008 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-run-netns\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711150 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/3d337b45-35c2-42c7-a28e-1498d3ec882d-ovnkube-config\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711174 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-systemd\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711198 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf9rm\" (UniqueName: \"kubernetes.io/projected/cbc3279c-8519-4d64-887e-5441f89c8b3d-kube-api-access-rf9rm\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711235 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-device-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711260 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-etc-selinux\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: 
I0423 13:31:56.711292 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711286 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxnsc\" (UniqueName: \"kubernetes.io/projected/db736963-96fd-4537-b82e-5a28f2543a84-kube-api-access-jxnsc\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711330 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d5b9a202-5bbf-447f-828a-8504cdc5749e-konnectivity-ca\") pod \"konnectivity-agent-hv7lz\" (UID: \"d5b9a202-5bbf-447f-828a-8504cdc5749e\") " pod="kube-system/konnectivity-agent-hv7lz" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711356 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-run-systemd\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711407 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-run-ovn-kubernetes\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711454 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb69ff46-2dee-4fb6-ad10-74074668a10f-hosts-file\") pod \"node-resolver-4mlb8\" (UID: \"eb69ff46-2dee-4fb6-ad10-74074668a10f\") " pod="openshift-dns/node-resolver-4mlb8"
Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711483 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-os-release\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711509 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-node-log\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.711538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711535 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-sysctl-conf\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.712269 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711556 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-var-lib-kubelet\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.712269 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711578 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-host\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.712269 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711920 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hv7lz"
Apr 23 13:31:56.712269 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.712184 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hc6xl"
Apr 23 13:31:56.712547 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.712521 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-d7plh\""
Apr 23 13:31:56.712862 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.712526 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 13:31:56.713120 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713094 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db736963-96fd-4537-b82e-5a28f2543a84-cni-binary-copy\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.713206 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.711580 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 13:31:56.713206 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713194 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-socket-dir-parent\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.713313 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713221 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-lib-modules\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.713313 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713245 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-run-k8s-cni-cncf-io\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.713313 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713274 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-log-socket\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.713313 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713297 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbc3279c-8519-4d64-887e-5441f89c8b3d-tmp\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.713498 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713321 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ce8caef2-3b61-4f26-ab2b-c0770c8d1569-iptables-alerter-script\") pod \"iptables-alerter-djgmc\" (UID: \"ce8caef2-3b61-4f26-ab2b-c0770c8d1569\") " pod="openshift-network-operator/iptables-alerter-djgmc"
Apr 23 13:31:56.713498 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713344 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ce8caef2-3b61-4f26-ab2b-c0770c8d1569-host-slash\") pod \"iptables-alerter-djgmc\" (UID: \"ce8caef2-3b61-4f26-ab2b-c0770c8d1569\") " pod="openshift-network-operator/iptables-alerter-djgmc"
Apr 23 13:31:56.713498 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713366 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-cni-dir\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.713498 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713388 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-etc-kubernetes\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.713498 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713411 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.713498 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713446 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-run-ovn\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.713498 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713469 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:31:56.713498 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713492 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzcb4\" (UniqueName: \"kubernetes.io/projected/8372a373-96b3-40a7-a175-86077c4b2030-kube-api-access-rzcb4\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713516 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-var-lib-cni-bin\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713540 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-cnibin\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713566 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-systemd-units\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713591 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-cni-netd\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713615 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-kubernetes\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713658 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-sysctl-d\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713699 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-system-cni-dir\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713726 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-system-cni-dir\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713757 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkdp9\" (UniqueName: \"kubernetes.io/projected/3d337b45-35c2-42c7-a28e-1498d3ec882d-kube-api-access-kkdp9\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713780 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58xvm\" (UniqueName: \"kubernetes.io/projected/f86b7cae-79f2-41e1-8969-860e35e415e9-kube-api-access-58xvm\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713804 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/db736963-96fd-4537-b82e-5a28f2543a84-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.713861 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713827 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-kubelet\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.714423 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713884 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d337b45-35c2-42c7-a28e-1498d3ec882d-env-overrides\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.714423 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713907 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d337b45-35c2-42c7-a28e-1498d3ec882d-ovn-node-metrics-cert\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.714423 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713930 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cvgj\" (UniqueName: \"kubernetes.io/projected/ce8caef2-3b61-4f26-ab2b-c0770c8d1569-kube-api-access-7cvgj\") pod \"iptables-alerter-djgmc\" (UID: \"ce8caef2-3b61-4f26-ab2b-c0770c8d1569\") " pod="openshift-network-operator/iptables-alerter-djgmc"
Apr 23 13:31:56.714423 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.713975 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-conf-dir\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.714423 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.714378 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 13:31:56.715280 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.715095 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 13:31:56.715518 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.715417 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 13:31:56.715585 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.715537 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-2frzk\""
Apr 23 13:31:56.716105 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.716031 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5h8nj\""
Apr 23 13:31:56.717174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.717124 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 13:31:56.717350 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.717334 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 13:31:56.745169 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.745135 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:55 +0000 UTC" deadline="2027-12-25 08:58:08.624814428 +0000 UTC"
Apr 23 13:31:56.745169 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.745162 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14659h26m11.879656015s"
Apr 23 13:31:56.803398 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.803378 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 13:31:56.815092 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815064 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-var-lib-cni-multus\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.815194 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815148 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.815194 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815181 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d337b45-35c2-42c7-a28e-1498d3ec882d-ovnkube-script-lib\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.815308 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815213 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-sysconfig\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.815308 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815244 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-socket-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.815308 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815276 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30d3d180-9aae-49e3-9c8b-f13ce3df5f68-host\") pod \"node-ca-hc6xl\" (UID: \"30d3d180-9aae-49e3-9c8b-f13ce3df5f68\") " pod="openshift-image-registry/node-ca-hc6xl"
Apr 23 13:31:56.815455 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815319 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-run-netns\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.815455 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815345 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4r6b4\" (UniqueName: \"kubernetes.io/projected/3ab46110-c05e-4dde-9c0e-2a035e761a4a-kube-api-access-4r6b4\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.815455 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815374 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-run-openvswitch\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.815455 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815406 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-registration-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.815455 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815437 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpxz4\" (UniqueName: \"kubernetes.io/projected/30d3d180-9aae-49e3-9c8b-f13ce3df5f68-kube-api-access-fpxz4\") pod \"node-ca-hc6xl\" (UID: \"30d3d180-9aae-49e3-9c8b-f13ce3df5f68\") " pod="openshift-image-registry/node-ca-hc6xl"
Apr 23 13:31:56.815647 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815467 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ab46110-c05e-4dde-9c0e-2a035e761a4a-cni-binary-copy\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.815647 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815495 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db736963-96fd-4537-b82e-5a28f2543a84-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.815647 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815534 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7rr\" (UniqueName: \"kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr\") pod \"network-check-target-xmjsn\" (UID: \"2a1fbae2-2df7-41eb-9ed9-aac09b5af692\") " pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:31:56.815647 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815566 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb69ff46-2dee-4fb6-ad10-74074668a10f-tmp-dir\") pod \"node-resolver-4mlb8\" (UID: \"eb69ff46-2dee-4fb6-ad10-74074668a10f\") " pod="openshift-dns/node-resolver-4mlb8"
Apr 23 13:31:56.815647 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815596 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-tuned\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.815647 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815623 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-hostroot\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.815910 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815650 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-etc-openvswitch\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.815910 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815678 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-cnibin\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.815910 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815708 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-slash\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.815910 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815738 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-var-lib-openvswitch\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.815910 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815767 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-modprobe-d\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.815910 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815798 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-os-release\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.815910 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815822 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-var-lib-kubelet\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.815910 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815854 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d5b9a202-5bbf-447f-828a-8504cdc5749e-agent-certs\") pod \"konnectivity-agent-hv7lz\" (UID: \"d5b9a202-5bbf-447f-828a-8504cdc5749e\") " pod="kube-system/konnectivity-agent-hv7lz"
Apr 23 13:31:56.816227 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815884 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bb5tf\" (UniqueName: \"kubernetes.io/projected/eb69ff46-2dee-4fb6-ad10-74074668a10f-kube-api-access-bb5tf\") pod \"node-resolver-4mlb8\" (UID: \"eb69ff46-2dee-4fb6-ad10-74074668a10f\") " pod="openshift-dns/node-resolver-4mlb8"
Apr 23 13:31:56.816227 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815915 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-run-netns\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.816227 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815944 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d337b45-35c2-42c7-a28e-1498d3ec882d-ovnkube-config\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.816227 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.815969 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-systemd\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.816227 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816001 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rf9rm\" (UniqueName: \"kubernetes.io/projected/cbc3279c-8519-4d64-887e-5441f89c8b3d-kube-api-access-rf9rm\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.816227 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816046 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-device-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.816227 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816078 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-etc-selinux\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.816227 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816110 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxnsc\" (UniqueName: \"kubernetes.io/projected/db736963-96fd-4537-b82e-5a28f2543a84-kube-api-access-jxnsc\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.816227 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816141 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d5b9a202-5bbf-447f-828a-8504cdc5749e-konnectivity-ca\") pod \"konnectivity-agent-hv7lz\" (UID: \"d5b9a202-5bbf-447f-828a-8504cdc5749e\") " pod="kube-system/konnectivity-agent-hv7lz"
Apr 23 13:31:56.816227 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816171 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-run-systemd\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816203 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-run-ovn-kubernetes\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816235 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb69ff46-2dee-4fb6-ad10-74074668a10f-hosts-file\") pod \"node-resolver-4mlb8\" (UID: \"eb69ff46-2dee-4fb6-ad10-74074668a10f\") " pod="openshift-dns/node-resolver-4mlb8"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816266 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-os-release\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816291 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-node-log\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816324 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-sysctl-conf\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816354 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-var-lib-kubelet\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816383 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-slash\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816434 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-var-lib-openvswitch\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816471 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-host\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816501 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-device-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.816603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816524 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-var-lib-cni-multus\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.816994 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816575 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.816994 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816625 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-etc-selinux\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.816994 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816716 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-modprobe-d\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.816994 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816782 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-os-release\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.816994 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816787 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-run-ovn-kubernetes\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.816994 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816840 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-run-systemd\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.816994 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816930 2562 operation_generator.go:615] "MountVolume.SetUp
succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb69ff46-2dee-4fb6-ad10-74074668a10f-hosts-file\") pod \"node-resolver-4mlb8\" (UID: \"eb69ff46-2dee-4fb6-ad10-74074668a10f\") " pod="openshift-dns/node-resolver-4mlb8" Apr 23 13:31:56.817196 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816998 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-os-release\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.817196 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817074 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-run-netns\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.817196 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817126 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-node-log\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.817288 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817251 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-sysctl-conf\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.817288 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817268 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/3d337b45-35c2-42c7-a28e-1498d3ec882d-ovnkube-script-lib\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.817355 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817310 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-var-lib-kubelet\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.817396 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817381 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-var-lib-kubelet\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.817443 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817411 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ab46110-c05e-4dde-9c0e-2a035e761a4a-cni-binary-copy\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.817477 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817453 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-run-netns\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.817555 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817537 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/3d337b45-35c2-42c7-a28e-1498d3ec882d-ovnkube-config\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.817669 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817626 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-systemd\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.817705 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817694 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-run-openvswitch\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.817746 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817731 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 13:31:56.817780 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817770 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-registration-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" Apr 23 13:31:56.817816 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817796 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db736963-96fd-4537-b82e-5a28f2543a84-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.817866 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817850 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-sysconfig\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.817920 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817906 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-hostroot\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.817953 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817919 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-socket-dir\") 
pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" Apr 23 13:31:56.818026 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.817987 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-etc-openvswitch\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.818507 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.816386 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-host\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.818623 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.818535 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db736963-96fd-4537-b82e-5a28f2543a84-cni-binary-copy\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.818623 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.818494 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d5b9a202-5bbf-447f-828a-8504cdc5749e-konnectivity-ca\") pod \"konnectivity-agent-hv7lz\" (UID: \"d5b9a202-5bbf-447f-828a-8504cdc5749e\") " pod="kube-system/konnectivity-agent-hv7lz" Apr 23 13:31:56.818623 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.818117 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-cnibin\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.818804 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.818739 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb69ff46-2dee-4fb6-ad10-74074668a10f-tmp-dir\") pod \"node-resolver-4mlb8\" (UID: \"eb69ff46-2dee-4fb6-ad10-74074668a10f\") " pod="openshift-dns/node-resolver-4mlb8" Apr 23 13:31:56.818879 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.818848 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-socket-dir-parent\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.818939 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.818889 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-lib-modules\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.818939 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.818925 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-run-k8s-cni-cncf-io\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.819050 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.818952 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-log-socket\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.819050 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.818985 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbc3279c-8519-4d64-887e-5441f89c8b3d-tmp\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.819050 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819037 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ce8caef2-3b61-4f26-ab2b-c0770c8d1569-iptables-alerter-script\") pod \"iptables-alerter-djgmc\" (UID: \"ce8caef2-3b61-4f26-ab2b-c0770c8d1569\") " pod="openshift-network-operator/iptables-alerter-djgmc" Apr 23 13:31:56.819217 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819071 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ce8caef2-3b61-4f26-ab2b-c0770c8d1569-host-slash\") pod \"iptables-alerter-djgmc\" (UID: \"ce8caef2-3b61-4f26-ab2b-c0770c8d1569\") " pod="openshift-network-operator/iptables-alerter-djgmc" Apr 23 13:31:56.819217 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819102 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-cni-dir\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.819217 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819133 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-etc-kubernetes\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.819217 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819160 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.819217 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819197 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-run-ovn\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.819445 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819229 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:31:56.819445 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819265 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzcb4\" (UniqueName: \"kubernetes.io/projected/8372a373-96b3-40a7-a175-86077c4b2030-kube-api-access-rzcb4\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:31:56.819445 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819300 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-var-lib-cni-bin\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.819445 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819362 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-var-lib-cni-bin\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.819445 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819376 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db736963-96fd-4537-b82e-5a28f2543a84-cni-binary-copy\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.819445 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819420 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ce8caef2-3b61-4f26-ab2b-c0770c8d1569-host-slash\") pod \"iptables-alerter-djgmc\" (UID: \"ce8caef2-3b61-4f26-ab2b-c0770c8d1569\") " pod="openshift-network-operator/iptables-alerter-djgmc" Apr 23 13:31:56.819716 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819475 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-log-socket\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.819716 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819511 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.819716 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819520 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-lib-modules\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.819716 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819524 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-run-k8s-cni-cncf-io\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.819716 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819601 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-socket-dir-parent\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.819716 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819603 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-run-ovn\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.819716 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819710 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-cni-dir\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.820036 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819780 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ce8caef2-3b61-4f26-ab2b-c0770c8d1569-iptables-alerter-script\") pod \"iptables-alerter-djgmc\" (UID: \"ce8caef2-3b61-4f26-ab2b-c0770c8d1569\") " pod="openshift-network-operator/iptables-alerter-djgmc" Apr 23 13:31:56.820036 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819794 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-etc-kubernetes\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.820036 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819784 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-cnibin\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.820036 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819837 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-cnibin\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.820036 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:56.819876 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:56.820036 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819948 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-systemd-units\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.820286 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.819900 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-systemd-units\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.820682 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:56.820651 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs podName:8372a373-96b3-40a7-a175-86077c4b2030 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:57.319963783 +0000 UTC m=+3.014869047 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs") pod "network-metrics-daemon-kdj59" (UID: "8372a373-96b3-40a7-a175-86077c4b2030") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:56.820769 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820709 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-cni-netd\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.820769 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820736 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-kubernetes\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.820769 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820759 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-sysctl-d\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" Apr 23 13:31:56.820927 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820782 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-system-cni-dir\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54" Apr 23 13:31:56.820927 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820806 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-system-cni-dir\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw" Apr 23 13:31:56.820927 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820830 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkdp9\" (UniqueName: \"kubernetes.io/projected/3d337b45-35c2-42c7-a28e-1498d3ec882d-kube-api-access-kkdp9\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:31:56.820927 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820854 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58xvm\" (UniqueName: \"kubernetes.io/projected/f86b7cae-79f2-41e1-8969-860e35e415e9-kube-api-access-58xvm\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" Apr 23 13:31:56.820927 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820881 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/30d3d180-9aae-49e3-9c8b-f13ce3df5f68-serviceca\") pod \"node-ca-hc6xl\" (UID: \"30d3d180-9aae-49e3-9c8b-f13ce3df5f68\") " pod="openshift-image-registry/node-ca-hc6xl" Apr 23 13:31:56.820927 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820907 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/db736963-96fd-4537-b82e-5a28f2543a84-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " 
pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820934 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-kubelet\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820959 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d337b45-35c2-42c7-a28e-1498d3ec882d-env-overrides\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.820982 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d337b45-35c2-42c7-a28e-1498d3ec882d-ovn-node-metrics-cert\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821003 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cvgj\" (UniqueName: \"kubernetes.io/projected/ce8caef2-3b61-4f26-ab2b-c0770c8d1569-kube-api-access-7cvgj\") pod \"iptables-alerter-djgmc\" (UID: \"ce8caef2-3b61-4f26-ab2b-c0770c8d1569\") " pod="openshift-network-operator/iptables-alerter-djgmc"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821042 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-conf-dir\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821062 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-daemon-config\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821082 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-run-multus-certs\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821102 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-cni-bin\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821122 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-run\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821145 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-sys\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821166 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821210 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-sys-fs\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.821329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821299 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-sys-fs\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.822174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821343 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-cni-netd\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.822174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821395 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-kubernetes\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.822174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821480 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-sysctl-d\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.822174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821526 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-system-cni-dir\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.822174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.821564 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db736963-96fd-4537-b82e-5a28f2543a84-system-cni-dir\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.822481 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.822458 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/db736963-96fd-4537-b82e-5a28f2543a84-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.822581 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.822544 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-host-run-multus-certs\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.822631 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.822617 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-cni-bin\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.822678 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.822665 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-run\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.822749 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.822724 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbc3279c-8519-4d64-887e-5441f89c8b3d-sys\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.822806 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.822776 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f86b7cae-79f2-41e1-8969-860e35e415e9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.822806 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.822786 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d5b9a202-5bbf-447f-828a-8504cdc5749e-agent-certs\") pod \"konnectivity-agent-hv7lz\" (UID: \"d5b9a202-5bbf-447f-828a-8504cdc5749e\") " pod="kube-system/konnectivity-agent-hv7lz"
Apr 23 13:31:56.822806 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.822788 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-daemon-config\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.822950 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.822827 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d337b45-35c2-42c7-a28e-1498d3ec882d-host-kubelet\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.823008 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.822957 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbc3279c-8519-4d64-887e-5441f89c8b3d-tmp\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.823332 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.823311 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d337b45-35c2-42c7-a28e-1498d3ec882d-env-overrides\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.823409 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.823375 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ab46110-c05e-4dde-9c0e-2a035e761a4a-multus-conf-dir\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.823525 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.823494 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cbc3279c-8519-4d64-887e-5441f89c8b3d-etc-tuned\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.823939 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:56.823920 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:56.824231 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:56.824135 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:56.824231 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:56.824153 2562 projected.go:194] Error preparing data for projected volume kube-api-access-7k7rr for pod openshift-network-diagnostics/network-check-target-xmjsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:56.824231 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:56.824208 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr podName:2a1fbae2-2df7-41eb-9ed9-aac09b5af692 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:57.324189846 +0000 UTC m=+3.019095110 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7k7rr" (UniqueName: "kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr") pod "network-check-target-xmjsn" (UID: "2a1fbae2-2df7-41eb-9ed9-aac09b5af692") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:56.824896 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.824769 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxnsc\" (UniqueName: \"kubernetes.io/projected/db736963-96fd-4537-b82e-5a28f2543a84-kube-api-access-jxnsc\") pod \"multus-additional-cni-plugins-87dsw\" (UID: \"db736963-96fd-4537-b82e-5a28f2543a84\") " pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:56.825561 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.825523 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r6b4\" (UniqueName: \"kubernetes.io/projected/3ab46110-c05e-4dde-9c0e-2a035e761a4a-kube-api-access-4r6b4\") pod \"multus-bjz54\" (UID: \"3ab46110-c05e-4dde-9c0e-2a035e761a4a\") " pod="openshift-multus/multus-bjz54"
Apr 23 13:31:56.825911 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.825870 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d337b45-35c2-42c7-a28e-1498d3ec882d-ovn-node-metrics-cert\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.825911 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.825900 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb5tf\" (UniqueName: \"kubernetes.io/projected/eb69ff46-2dee-4fb6-ad10-74074668a10f-kube-api-access-bb5tf\") pod \"node-resolver-4mlb8\" (UID: \"eb69ff46-2dee-4fb6-ad10-74074668a10f\") " pod="openshift-dns/node-resolver-4mlb8"
Apr 23 13:31:56.827214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.827197 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf9rm\" (UniqueName: \"kubernetes.io/projected/cbc3279c-8519-4d64-887e-5441f89c8b3d-kube-api-access-rf9rm\") pod \"tuned-xgv6v\" (UID: \"cbc3279c-8519-4d64-887e-5441f89c8b3d\") " pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:56.827383 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.827362 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzcb4\" (UniqueName: \"kubernetes.io/projected/8372a373-96b3-40a7-a175-86077c4b2030-kube-api-access-rzcb4\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:31:56.834617 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.834579 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58xvm\" (UniqueName: \"kubernetes.io/projected/f86b7cae-79f2-41e1-8969-860e35e415e9-kube-api-access-58xvm\") pod \"aws-ebs-csi-driver-node-dmm6l\" (UID: \"f86b7cae-79f2-41e1-8969-860e35e415e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:56.835879 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.835831 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cvgj\" (UniqueName: \"kubernetes.io/projected/ce8caef2-3b61-4f26-ab2b-c0770c8d1569-kube-api-access-7cvgj\") pod \"iptables-alerter-djgmc\" (UID: \"ce8caef2-3b61-4f26-ab2b-c0770c8d1569\") " pod="openshift-network-operator/iptables-alerter-djgmc"
Apr 23 13:31:56.836456 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.836431 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkdp9\" (UniqueName: \"kubernetes.io/projected/3d337b45-35c2-42c7-a28e-1498d3ec882d-kube-api-access-kkdp9\") pod \"ovnkube-node-bmczc\" (UID: \"3d337b45-35c2-42c7-a28e-1498d3ec882d\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:56.921994 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.921966 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/30d3d180-9aae-49e3-9c8b-f13ce3df5f68-serviceca\") pod \"node-ca-hc6xl\" (UID: \"30d3d180-9aae-49e3-9c8b-f13ce3df5f68\") " pod="openshift-image-registry/node-ca-hc6xl"
Apr 23 13:31:56.922130 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.922029 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30d3d180-9aae-49e3-9c8b-f13ce3df5f68-host\") pod \"node-ca-hc6xl\" (UID: \"30d3d180-9aae-49e3-9c8b-f13ce3df5f68\") " pod="openshift-image-registry/node-ca-hc6xl"
Apr 23 13:31:56.922130 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.922058 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpxz4\" (UniqueName: \"kubernetes.io/projected/30d3d180-9aae-49e3-9c8b-f13ce3df5f68-kube-api-access-fpxz4\") pod \"node-ca-hc6xl\" (UID: \"30d3d180-9aae-49e3-9c8b-f13ce3df5f68\") " pod="openshift-image-registry/node-ca-hc6xl"
Apr 23 13:31:56.922219 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.922138 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30d3d180-9aae-49e3-9c8b-f13ce3df5f68-host\") pod \"node-ca-hc6xl\" (UID: \"30d3d180-9aae-49e3-9c8b-f13ce3df5f68\") " pod="openshift-image-registry/node-ca-hc6xl"
Apr 23 13:31:56.922433 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.922415 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/30d3d180-9aae-49e3-9c8b-f13ce3df5f68-serviceca\") pod \"node-ca-hc6xl\" (UID: \"30d3d180-9aae-49e3-9c8b-f13ce3df5f68\") " pod="openshift-image-registry/node-ca-hc6xl"
Apr 23 13:31:56.930272 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:56.930250 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpxz4\" (UniqueName: \"kubernetes.io/projected/30d3d180-9aae-49e3-9c8b-f13ce3df5f68-kube-api-access-fpxz4\") pod \"node-ca-hc6xl\" (UID: \"30d3d180-9aae-49e3-9c8b-f13ce3df5f68\") " pod="openshift-image-registry/node-ca-hc6xl"
Apr 23 13:31:57.010273 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.010250 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:31:57.021779 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.021753 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bjz54"
Apr 23 13:31:57.029393 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.029376 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hv7lz"
Apr 23 13:31:57.036843 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.036825 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-djgmc"
Apr 23 13:31:57.046408 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.046389 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4mlb8"
Apr 23 13:31:57.052992 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.052972 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xgv6v"
Apr 23 13:31:57.059539 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.059522 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-87dsw"
Apr 23 13:31:57.069063 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.069046 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l"
Apr 23 13:31:57.073477 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.073461 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hc6xl"
Apr 23 13:31:57.201964 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.201937 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:57.325370 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.325343 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:31:57.325514 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.325405 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7rr\" (UniqueName: \"kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr\") pod \"network-check-target-xmjsn\" (UID: \"2a1fbae2-2df7-41eb-9ed9-aac09b5af692\") " pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:31:57.325514 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:57.325491 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:57.325637 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:57.325532 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:57.325637 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:57.325545 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:57.325637 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:57.325553 2562 projected.go:194] Error preparing data for projected volume kube-api-access-7k7rr for pod openshift-network-diagnostics/network-check-target-xmjsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:57.325637 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:57.325559 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs podName:8372a373-96b3-40a7-a175-86077c4b2030 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:58.325539525 +0000 UTC m=+4.020444784 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs") pod "network-metrics-daemon-kdj59" (UID: "8372a373-96b3-40a7-a175-86077c4b2030") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:57.325637 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:57.325593 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr podName:2a1fbae2-2df7-41eb-9ed9-aac09b5af692 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:58.32557792 +0000 UTC m=+4.020483186 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7k7rr" (UniqueName: "kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr") pod "network-check-target-xmjsn" (UID: "2a1fbae2-2df7-41eb-9ed9-aac09b5af692") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:57.383875 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:57.383847 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ab46110_c05e_4dde_9c0e_2a035e761a4a.slice/crio-1cb30d6949a120cae19a972f3c142328768048f49b14c8242f840b376998f752 WatchSource:0}: Error finding container 1cb30d6949a120cae19a972f3c142328768048f49b14c8242f840b376998f752: Status 404 returned error can't find the container with id 1cb30d6949a120cae19a972f3c142328768048f49b14c8242f840b376998f752
Apr 23 13:31:57.386266 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:57.386234 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb69ff46_2dee_4fb6_ad10_74074668a10f.slice/crio-842192a8145a5e083628e4295af0cbd8aa7c3db0b6512a5322667e6a1a3e18e6 WatchSource:0}: Error finding container 842192a8145a5e083628e4295af0cbd8aa7c3db0b6512a5322667e6a1a3e18e6: Status 404 returned error can't find the container with id 842192a8145a5e083628e4295af0cbd8aa7c3db0b6512a5322667e6a1a3e18e6
Apr 23 13:31:57.389292 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:57.389267 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb736963_96fd_4537_b82e_5a28f2543a84.slice/crio-1832e54a6cc4ec666dfba153308cff8312843bbafb8b3927f90c62e625e3038d WatchSource:0}: Error finding container 1832e54a6cc4ec666dfba153308cff8312843bbafb8b3927f90c62e625e3038d: Status 404 returned error can't find the container with id 1832e54a6cc4ec666dfba153308cff8312843bbafb8b3927f90c62e625e3038d
Apr 23 13:31:57.390180 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:57.390159 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf86b7cae_79f2_41e1_8969_860e35e415e9.slice/crio-8078267fb49d44d2f87127b76c5bfae527d18a46c4a35ee5ab13e986a82196a8 WatchSource:0}: Error finding container 8078267fb49d44d2f87127b76c5bfae527d18a46c4a35ee5ab13e986a82196a8: Status 404 returned error can't find the container with id 8078267fb49d44d2f87127b76c5bfae527d18a46c4a35ee5ab13e986a82196a8
Apr 23 13:31:57.390908 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:57.390732 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc3279c_8519_4d64_887e_5441f89c8b3d.slice/crio-d5e56de27b6adbbd45d77869bc3461f5d813c8a4d290b3cc29aecdfeb89ab4f7 WatchSource:0}: Error finding container d5e56de27b6adbbd45d77869bc3461f5d813c8a4d290b3cc29aecdfeb89ab4f7: Status 404 returned error can't find the container with id d5e56de27b6adbbd45d77869bc3461f5d813c8a4d290b3cc29aecdfeb89ab4f7
Apr 23 13:31:57.392201 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:31:57.392178 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d337b45_35c2_42c7_a28e_1498d3ec882d.slice/crio-b60762446a158cedd0f58501f848e8311e1a05593960c12fdf920291add072aa WatchSource:0}: Error finding container b60762446a158cedd0f58501f848e8311e1a05593960c12fdf920291add072aa: Status 404 returned error can't find the container with id b60762446a158cedd0f58501f848e8311e1a05593960c12fdf920291add072aa
Apr 23 13:31:57.746415 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.746169 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:55 +0000 UTC" deadline="2028-01-09 21:36:48.2730212 +0000 UTC"
Apr 23 13:31:57.746415 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.746381 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15032h4m50.526644142s"
Apr 23 13:31:57.842510 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.842458 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" event={"ID":"3d337b45-35c2-42c7-a28e-1498d3ec882d","Type":"ContainerStarted","Data":"b60762446a158cedd0f58501f848e8311e1a05593960c12fdf920291add072aa"}
Apr 23 13:31:57.844455 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.844422 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87dsw" event={"ID":"db736963-96fd-4537-b82e-5a28f2543a84","Type":"ContainerStarted","Data":"1832e54a6cc4ec666dfba153308cff8312843bbafb8b3927f90c62e625e3038d"}
Apr 23 13:31:57.848543 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.848516 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4mlb8" event={"ID":"eb69ff46-2dee-4fb6-ad10-74074668a10f","Type":"ContainerStarted","Data":"842192a8145a5e083628e4295af0cbd8aa7c3db0b6512a5322667e6a1a3e18e6"}
Apr 23 13:31:57.850673 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.850647 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjz54" event={"ID":"3ab46110-c05e-4dde-9c0e-2a035e761a4a","Type":"ContainerStarted","Data":"1cb30d6949a120cae19a972f3c142328768048f49b14c8242f840b376998f752"}
Apr 23 13:31:57.854757 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.854732 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal" event={"ID":"dc1728671f1199487e42b58400f18934","Type":"ContainerStarted","Data":"90e07b51b9b12787366a17faa5db5f60d3ee4a562a1bd326673e3bfc261b732d"}
Apr 23 13:31:57.859123 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.858945 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" event={"ID":"cbc3279c-8519-4d64-887e-5441f89c8b3d","Type":"ContainerStarted","Data":"d5e56de27b6adbbd45d77869bc3461f5d813c8a4d290b3cc29aecdfeb89ab4f7"}
Apr 23 13:31:57.861133 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.861102 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" event={"ID":"f86b7cae-79f2-41e1-8969-860e35e415e9","Type":"ContainerStarted","Data":"8078267fb49d44d2f87127b76c5bfae527d18a46c4a35ee5ab13e986a82196a8"}
Apr 23 13:31:57.863347 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.863198 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-djgmc" event={"ID":"ce8caef2-3b61-4f26-ab2b-c0770c8d1569","Type":"ContainerStarted","Data":"06ceb720477a267ca78fe9b2ff83c966d13b3a500406894c86dff713e7cdc152"}
Apr 23 13:31:57.865854 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.865797 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hv7lz" event={"ID":"d5b9a202-5bbf-447f-828a-8504cdc5749e","Type":"ContainerStarted","Data":"1136c1ad33ff14e5a7622312723e3b5b3fa20e68c451c7ab4bd60289804e248a"}
Apr 23 13:31:57.869494 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.869457 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hc6xl" event={"ID":"30d3d180-9aae-49e3-9c8b-f13ce3df5f68","Type":"ContainerStarted","Data":"4e331d134b1d5e169cd59514b1ac18747785058e6e24e248ea6c7591d83e2b53"}
Apr 23 13:31:57.871267 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:57.870946 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-22.ec2.internal" podStartSLOduration=1.870931795 podStartE2EDuration="1.870931795s" podCreationTimestamp="2026-04-23 13:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:57.870081633 +0000 UTC m=+3.564986902" watchObservedRunningTime="2026-04-23 13:31:57.870931795 +0000 UTC m=+3.565837062"
Apr 23 13:31:58.332730 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:58.332649 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7rr\" (UniqueName: \"kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr\") pod \"network-check-target-xmjsn\" (UID: \"2a1fbae2-2df7-41eb-9ed9-aac09b5af692\") " pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:31:58.332730 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:58.332711 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:31:58.332942 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:58.332818 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:58.332942 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:58.332874 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs podName:8372a373-96b3-40a7-a175-86077c4b2030 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:00.332857298 +0000 UTC m=+6.027762559 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs") pod "network-metrics-daemon-kdj59" (UID: "8372a373-96b3-40a7-a175-86077c4b2030") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:58.333313 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:58.333290 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:58.333426 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:58.333316 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:58.333426 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:58.333329 2562 projected.go:194] Error preparing data for projected volume kube-api-access-7k7rr for pod openshift-network-diagnostics/network-check-target-xmjsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:58.333426 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:58.333373 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr podName:2a1fbae2-2df7-41eb-9ed9-aac09b5af692 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:00.333358236 +0000 UTC m=+6.028263479 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7k7rr" (UniqueName: "kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr") pod "network-check-target-xmjsn" (UID: "2a1fbae2-2df7-41eb-9ed9-aac09b5af692") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:58.837874 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:58.837159 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:31:58.837874 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:58.837291 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:31:58.837874 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:58.837712 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:31:58.837874 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:31:58.837804 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:31:58.875037 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:58.874666 2562 generic.go:358] "Generic (PLEG): container finished" podID="f94592837d3c06bcac331383e3a3148d" containerID="29b2f1a8a3de2fd8da749732f709fce0b59e9f90a1ccff1f60cad716fede2a4b" exitCode=0
Apr 23 13:31:58.875892 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:58.875848 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal" event={"ID":"f94592837d3c06bcac331383e3a3148d","Type":"ContainerDied","Data":"29b2f1a8a3de2fd8da749732f709fce0b59e9f90a1ccff1f60cad716fede2a4b"}
Apr 23 13:31:59.884851 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:31:59.884814 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal" event={"ID":"f94592837d3c06bcac331383e3a3148d","Type":"ContainerStarted","Data":"aa83e912665b5a795b12f7f66ce4412b92392806b72796c9fdf8fdfa154f4261"}
Apr 23 13:32:00.348843 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:00.348814 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7rr\" (UniqueName: \"kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr\") pod \"network-check-target-xmjsn\" (UID: \"2a1fbae2-2df7-41eb-9ed9-aac09b5af692\") " pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:00.349010 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:00.348862 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:00.349010
ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:00.348974 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:00.349010 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:00.348995 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:00.349202 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:00.349036 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:00.349202 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:00.349050 2562 projected.go:194] Error preparing data for projected volume kube-api-access-7k7rr for pod openshift-network-diagnostics/network-check-target-xmjsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:00.349202 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:00.349053 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs podName:8372a373-96b3-40a7-a175-86077c4b2030 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:04.349034325 +0000 UTC m=+10.043939576 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs") pod "network-metrics-daemon-kdj59" (UID: "8372a373-96b3-40a7-a175-86077c4b2030") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:00.349202 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:00.349102 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr podName:2a1fbae2-2df7-41eb-9ed9-aac09b5af692 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:04.349085076 +0000 UTC m=+10.043990329 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7k7rr" (UniqueName: "kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr") pod "network-check-target-xmjsn" (UID: "2a1fbae2-2df7-41eb-9ed9-aac09b5af692") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:00.836863 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:00.836833 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:00.837059 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:00.836833 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:00.837059 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:00.836970 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030" Apr 23 13:32:00.837195 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:00.837086 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692" Apr 23 13:32:02.836735 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:02.836694 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:02.836735 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:02.836725 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:02.837328 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:02.836826 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692" Apr 23 13:32:02.837328 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:02.836967 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030" Apr 23 13:32:04.380225 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:04.380193 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:04.380671 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:04.380253 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7rr\" (UniqueName: \"kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr\") pod \"network-check-target-xmjsn\" (UID: \"2a1fbae2-2df7-41eb-9ed9-aac09b5af692\") " pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:04.380671 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:04.380393 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:04.380671 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:04.380413 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:04.380671 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:04.380426 2562 projected.go:194] Error preparing data for projected volume kube-api-access-7k7rr for pod openshift-network-diagnostics/network-check-target-xmjsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:04.380671 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:04.380481 2562 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr podName:2a1fbae2-2df7-41eb-9ed9-aac09b5af692 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:12.380461344 +0000 UTC m=+18.075366605 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7k7rr" (UniqueName: "kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr") pod "network-check-target-xmjsn" (UID: "2a1fbae2-2df7-41eb-9ed9-aac09b5af692") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:04.380918 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:04.380871 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:04.380978 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:04.380924 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs podName:8372a373-96b3-40a7-a175-86077c4b2030 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:12.380908206 +0000 UTC m=+18.075813456 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs") pod "network-metrics-daemon-kdj59" (UID: "8372a373-96b3-40a7-a175-86077c4b2030") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:04.838455 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:04.837801 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:04.838455 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:04.837916 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692" Apr 23 13:32:04.838455 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:04.838309 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:04.838455 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:04.838410 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030" Apr 23 13:32:06.836446 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:06.836419 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:06.836879 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:06.836419 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:06.836879 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:06.836544 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030" Apr 23 13:32:06.836879 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:06.836604 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692" Apr 23 13:32:08.837115 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:08.837084 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:08.837573 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:08.837083 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:08.837573 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:08.837200 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692" Apr 23 13:32:08.837573 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:08.837297 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030" Apr 23 13:32:10.837002 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:10.836964 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:10.837460 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:10.836964 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:10.837460 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:10.837123 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030" Apr 23 13:32:10.837460 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:10.837274 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692" Apr 23 13:32:12.438303 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:12.438269 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:12.438736 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:12.438333 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7rr\" (UniqueName: \"kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr\") pod \"network-check-target-xmjsn\" (UID: \"2a1fbae2-2df7-41eb-9ed9-aac09b5af692\") " pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:12.438736 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:12.438406 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:12.438736 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:12.438483 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs podName:8372a373-96b3-40a7-a175-86077c4b2030 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:28.438463473 +0000 UTC m=+34.133368718 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs") pod "network-metrics-daemon-kdj59" (UID: "8372a373-96b3-40a7-a175-86077c4b2030") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:12.438736 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:12.438493 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:12.438736 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:12.438513 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:12.438736 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:12.438528 2562 projected.go:194] Error preparing data for projected volume kube-api-access-7k7rr for pod openshift-network-diagnostics/network-check-target-xmjsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:12.438736 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:12.438649 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr podName:2a1fbae2-2df7-41eb-9ed9-aac09b5af692 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:28.438630917 +0000 UTC m=+34.133536176 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7k7rr" (UniqueName: "kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr") pod "network-check-target-xmjsn" (UID: "2a1fbae2-2df7-41eb-9ed9-aac09b5af692") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:12.836630 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:12.836556 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:12.836789 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:12.836557 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:12.836789 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:12.836690 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030" Apr 23 13:32:12.836789 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:12.836755 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692" Apr 23 13:32:14.837828 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.837613 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:14.838366 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.837672 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:14.838366 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:14.837934 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030" Apr 23 13:32:14.838366 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:14.838038 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692" Apr 23 13:32:14.909437 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.909405 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjz54" event={"ID":"3ab46110-c05e-4dde-9c0e-2a035e761a4a","Type":"ContainerStarted","Data":"be59b1fec1bf46b0f33f3785c3a309f6d2c291bde560819ded9797fea7e4b5d1"} Apr 23 13:32:14.910887 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.910859 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" event={"ID":"cbc3279c-8519-4d64-887e-5441f89c8b3d","Type":"ContainerStarted","Data":"149390cb9bdfa4fe813d442ca0e5f3949d2edd51c3d35d3cc755bd34ba4e6d41"} Apr 23 13:32:14.912389 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.912364 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" event={"ID":"f86b7cae-79f2-41e1-8969-860e35e415e9","Type":"ContainerStarted","Data":"7497957ae2e42a421354c7ddefa55474fb600286a92884fba3aead25568eadd5"} Apr 23 13:32:14.914134 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.914095 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hv7lz" event={"ID":"d5b9a202-5bbf-447f-828a-8504cdc5749e","Type":"ContainerStarted","Data":"004ab94fd67924a6a26524c289d8e6a0aedc4e31bddf3e7ec27ecd876649f3f9"} Apr 23 13:32:14.916659 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.916637 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hc6xl" event={"ID":"30d3d180-9aae-49e3-9c8b-f13ce3df5f68","Type":"ContainerStarted","Data":"48abf840713d5775b0a2220ce34b6fba0c51cb2b46cde483fc418888d776b9a2"} Apr 23 13:32:14.919096 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.919076 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 13:32:14.919394 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.919372 2562 generic.go:358] "Generic (PLEG): container finished" podID="3d337b45-35c2-42c7-a28e-1498d3ec882d" containerID="3745dcd0f75b350f489b95061a81947b3f6e533d56f484dd73fb0c5f0f6e6c5e" exitCode=1 Apr 23 13:32:14.919474 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.919434 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" event={"ID":"3d337b45-35c2-42c7-a28e-1498d3ec882d","Type":"ContainerStarted","Data":"9d0d05f38eefeab19f3dd5d7f2e68f857892b66aff41a2cbf0ed9ff5bab5db4a"} Apr 23 13:32:14.919474 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.919452 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" event={"ID":"3d337b45-35c2-42c7-a28e-1498d3ec882d","Type":"ContainerStarted","Data":"25be9a1c6779f279953c92538ec54b845ed2d7a51dbab952d6376dff41308139"} Apr 23 13:32:14.919474 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.919461 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" event={"ID":"3d337b45-35c2-42c7-a28e-1498d3ec882d","Type":"ContainerStarted","Data":"5be0913730195182b08431412e20bf3516575bc81000db475fb8f33cff444e21"} Apr 23 13:32:14.919474 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.919469 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" event={"ID":"3d337b45-35c2-42c7-a28e-1498d3ec882d","Type":"ContainerStarted","Data":"861c8d275b7736ee1a3b32555fe8a4808692baa75101b3e6d4383578058b4f6b"} Apr 23 13:32:14.919644 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.919478 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" 
event={"ID":"3d337b45-35c2-42c7-a28e-1498d3ec882d","Type":"ContainerDied","Data":"3745dcd0f75b350f489b95061a81947b3f6e533d56f484dd73fb0c5f0f6e6c5e"} Apr 23 13:32:14.919644 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.919492 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" event={"ID":"3d337b45-35c2-42c7-a28e-1498d3ec882d","Type":"ContainerStarted","Data":"eb48779de8b25965c22d5e6c4f7de89293fc5a6fda4ad74a38b9556d4306f291"} Apr 23 13:32:14.920692 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.920672 2562 generic.go:358] "Generic (PLEG): container finished" podID="db736963-96fd-4537-b82e-5a28f2543a84" containerID="42ac9c1d5d675b232c1468f880db2fdd5196e0019300997e587aa3ff65ef1694" exitCode=0 Apr 23 13:32:14.920775 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.920699 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87dsw" event={"ID":"db736963-96fd-4537-b82e-5a28f2543a84","Type":"ContainerDied","Data":"42ac9c1d5d675b232c1468f880db2fdd5196e0019300997e587aa3ff65ef1694"} Apr 23 13:32:14.922096 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.922062 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4mlb8" event={"ID":"eb69ff46-2dee-4fb6-ad10-74074668a10f","Type":"ContainerStarted","Data":"145648cf5c65a77dcdd157ffa98913eb451f694004d53d231885364f27508ed6"} Apr 23 13:32:14.935301 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.935258 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-22.ec2.internal" podStartSLOduration=18.935245167 podStartE2EDuration="18.935245167s" podCreationTimestamp="2026-04-23 13:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:59.900339023 +0000 UTC m=+5.595244287" 
watchObservedRunningTime="2026-04-23 13:32:14.935245167 +0000 UTC m=+20.630150435"
Apr 23 13:32:14.935989 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.935956 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bjz54" podStartSLOduration=4.198240585 podStartE2EDuration="20.935946514s" podCreationTimestamp="2026-04-23 13:31:54 +0000 UTC" firstStartedPulling="2026-04-23 13:31:57.38629529 +0000 UTC m=+3.081200538" lastFinishedPulling="2026-04-23 13:32:14.124001222 +0000 UTC m=+19.818906467" observedRunningTime="2026-04-23 13:32:14.935058664 +0000 UTC m=+20.629963941" watchObservedRunningTime="2026-04-23 13:32:14.935946514 +0000 UTC m=+20.630851784"
Apr 23 13:32:14.952218 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.952176 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xgv6v" podStartSLOduration=4.24960239 podStartE2EDuration="20.952162657s" podCreationTimestamp="2026-04-23 13:31:54 +0000 UTC" firstStartedPulling="2026-04-23 13:31:57.392443036 +0000 UTC m=+3.087348280" lastFinishedPulling="2026-04-23 13:32:14.095003301 +0000 UTC m=+19.789908547" observedRunningTime="2026-04-23 13:32:14.951911825 +0000 UTC m=+20.646817092" watchObservedRunningTime="2026-04-23 13:32:14.952162657 +0000 UTC m=+20.647067924"
Apr 23 13:32:14.967838 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:14.967797 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hc6xl" podStartSLOduration=3.270293767 podStartE2EDuration="19.967783093s" podCreationTimestamp="2026-04-23 13:31:55 +0000 UTC" firstStartedPulling="2026-04-23 13:31:57.396303176 +0000 UTC m=+3.091208420" lastFinishedPulling="2026-04-23 13:32:14.093792491 +0000 UTC m=+19.788697746" observedRunningTime="2026-04-23 13:32:14.967357576 +0000 UTC m=+20.662262846" watchObservedRunningTime="2026-04-23 13:32:14.967783093 +0000 UTC m=+20.662688363"
Apr 23 13:32:15.020088 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:15.018714 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4mlb8" podStartSLOduration=4.312306158 podStartE2EDuration="21.018698071s" podCreationTimestamp="2026-04-23 13:31:54 +0000 UTC" firstStartedPulling="2026-04-23 13:31:57.388412863 +0000 UTC m=+3.083318107" lastFinishedPulling="2026-04-23 13:32:14.094804773 +0000 UTC m=+19.789710020" observedRunningTime="2026-04-23 13:32:14.987603276 +0000 UTC m=+20.682508552" watchObservedRunningTime="2026-04-23 13:32:15.018698071 +0000 UTC m=+20.713603339"
Apr 23 13:32:15.041203 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:15.041120 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hv7lz" podStartSLOduration=12.27735852 podStartE2EDuration="21.041104867s" podCreationTimestamp="2026-04-23 13:31:54 +0000 UTC" firstStartedPulling="2026-04-23 13:31:57.397061591 +0000 UTC m=+3.091966841" lastFinishedPulling="2026-04-23 13:32:06.160807935 +0000 UTC m=+11.855713188" observedRunningTime="2026-04-23 13:32:15.040556365 +0000 UTC m=+20.735461632" watchObservedRunningTime="2026-04-23 13:32:15.041104867 +0000 UTC m=+20.736010134"
Apr 23 13:32:15.133177 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:15.133155 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hc6xl_30d3d180-9aae-49e3-9c8b-f13ce3df5f68/node-ca/0.log"
Apr 23 13:32:15.829189 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:15.829150 2562 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 13:32:15.925977 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:15.925942 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" event={"ID":"f86b7cae-79f2-41e1-8969-860e35e415e9","Type":"ContainerStarted","Data":"12fa644d99be089903eddf48b9227967a7bd65c126ea8a696c90a40ae180d39a"}
Apr 23 13:32:15.927459 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:15.927427 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-djgmc" event={"ID":"ce8caef2-3b61-4f26-ab2b-c0770c8d1569","Type":"ContainerStarted","Data":"c83ed2f0109f92689f530fa26cdac1144f935645a177dce30e58bc9be8c9bc6f"}
Apr 23 13:32:15.952892 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:15.952814 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-djgmc" podStartSLOduration=5.256996435 podStartE2EDuration="21.952801254s" podCreationTimestamp="2026-04-23 13:31:54 +0000 UTC" firstStartedPulling="2026-04-23 13:31:57.397543146 +0000 UTC m=+3.092448393" lastFinishedPulling="2026-04-23 13:32:14.093347956 +0000 UTC m=+19.788253212" observedRunningTime="2026-04-23 13:32:15.952441392 +0000 UTC m=+21.647346659" watchObservedRunningTime="2026-04-23 13:32:15.952801254 +0000 UTC m=+21.647706530"
Apr 23 13:32:15.984136 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:15.984107 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hv7lz"
Apr 23 13:32:15.984751 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:15.984734 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hv7lz"
Apr 23 13:32:16.774377 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:16.774270 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:32:15.829171593Z","UUID":"3901140c-5bf0-441e-96e6-a7651d77d42f","Handler":null,"Name":"","Endpoint":""}
Apr 23 13:32:16.778192 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:16.778170 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 13:32:16.778301 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:16.778213 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 13:32:16.836818 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:16.836797 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:16.836966 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:16.836928 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:16.836966 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:16.836956 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:16.837136 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:16.837068 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:16.883698 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:16.883670 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hv7lz"
Apr 23 13:32:16.884236 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:16.884214 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hv7lz"
Apr 23 13:32:16.930715 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:16.930680 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" event={"ID":"f86b7cae-79f2-41e1-8969-860e35e415e9","Type":"ContainerStarted","Data":"61a624aa6fccb20b7c5da10ecfd7cc6afb831e9b2c860fbc5eb8908b499ec646"}
Apr 23 13:32:16.948871 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:16.948822 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmm6l" podStartSLOduration=3.625808121 podStartE2EDuration="22.948808792s" podCreationTimestamp="2026-04-23 13:31:54 +0000 UTC" firstStartedPulling="2026-04-23 13:31:57.39208339 +0000 UTC m=+3.086988633" lastFinishedPulling="2026-04-23 13:32:16.71508406 +0000 UTC m=+22.409989304" observedRunningTime="2026-04-23 13:32:16.948459835 +0000 UTC m=+22.643365101" watchObservedRunningTime="2026-04-23 13:32:16.948808792 +0000 UTC m=+22.643714052"
Apr 23 13:32:17.934807 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:17.934668 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log"
Apr 23 13:32:17.935217 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:17.935157 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" event={"ID":"3d337b45-35c2-42c7-a28e-1498d3ec882d","Type":"ContainerStarted","Data":"a0a04e0f22d8a097ca214d1e0f3ac0b648ef173d6bbf22a7fd3f9b2b8db8c135"}
Apr 23 13:32:18.836776 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:18.836741 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:18.836941 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:18.836740 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:18.836941 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:18.836870 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:18.837063 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:18.836955 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:19.941977 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:19.941812 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log"
Apr 23 13:32:19.943751 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:19.942204 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" event={"ID":"3d337b45-35c2-42c7-a28e-1498d3ec882d","Type":"ContainerStarted","Data":"4978d564023a57e1a6e234eb1973cee52e2dc632157033e38523f829f9749dd3"}
Apr 23 13:32:19.943751 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:19.942488 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:32:19.943751 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:19.942679 2562 scope.go:117] "RemoveContainer" containerID="3745dcd0f75b350f489b95061a81947b3f6e533d56f484dd73fb0c5f0f6e6c5e"
Apr 23 13:32:19.957380 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:19.957361 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:32:20.837287 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:20.837053 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:20.837408 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:20.837073 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:20.837408 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:20.837376 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:20.837520 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:20.837465 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:20.947549 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:20.947526 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log"
Apr 23 13:32:20.947949 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:20.947926 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" event={"ID":"3d337b45-35c2-42c7-a28e-1498d3ec882d","Type":"ContainerStarted","Data":"72e6de7b2844430c8aed40fd60860fb74766a22aaae073c878a1e2de00f1cfeb"}
Apr 23 13:32:20.948291 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:20.948229 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:32:20.948291 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:20.948254 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:32:20.952933 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:20.952910 2562 generic.go:358] "Generic (PLEG): container finished" podID="db736963-96fd-4537-b82e-5a28f2543a84" containerID="4a636f564e8d34c7a4fa06832efbd8c84dbd0688c08716edd236f97bd0e0e629" exitCode=0
Apr 23 13:32:20.953048 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:20.952944 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87dsw" event={"ID":"db736963-96fd-4537-b82e-5a28f2543a84","Type":"ContainerDied","Data":"4a636f564e8d34c7a4fa06832efbd8c84dbd0688c08716edd236f97bd0e0e629"}
Apr 23 13:32:20.964784 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:20.964756 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc"
Apr 23 13:32:20.978996 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:20.978946 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" podStartSLOduration=10.230118031 podStartE2EDuration="26.978935841s" podCreationTimestamp="2026-04-23 13:31:54 +0000 UTC" firstStartedPulling="2026-04-23 13:31:57.394432534 +0000 UTC m=+3.089337778" lastFinishedPulling="2026-04-23 13:32:14.143250343 +0000 UTC m=+19.838155588" observedRunningTime="2026-04-23 13:32:20.978612861 +0000 UTC m=+26.673518140" watchObservedRunningTime="2026-04-23 13:32:20.978935841 +0000 UTC m=+26.673841138"
Apr 23 13:32:21.956796 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:21.956758 2562 generic.go:358] "Generic (PLEG): container finished" podID="db736963-96fd-4537-b82e-5a28f2543a84" containerID="46e25195ca102ffd3baf216d447d7534943b9834472edfb93216c77f93b88508" exitCode=0
Apr 23 13:32:21.957209 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:21.956846 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87dsw" event={"ID":"db736963-96fd-4537-b82e-5a28f2543a84","Type":"ContainerDied","Data":"46e25195ca102ffd3baf216d447d7534943b9834472edfb93216c77f93b88508"}
Apr 23 13:32:22.837250 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:22.837169 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:22.837412 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:22.837168 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:22.837412 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:22.837280 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:22.837412 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:22.837342 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:22.959962 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:22.959931 2562 generic.go:358] "Generic (PLEG): container finished" podID="db736963-96fd-4537-b82e-5a28f2543a84" containerID="e3edfb9ff75ed38bde1d7429fc190f0e17f1e0600046722f38029a14eae79c07" exitCode=0
Apr 23 13:32:22.960392 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:22.960026 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87dsw" event={"ID":"db736963-96fd-4537-b82e-5a28f2543a84","Type":"ContainerDied","Data":"e3edfb9ff75ed38bde1d7429fc190f0e17f1e0600046722f38029a14eae79c07"}
Apr 23 13:32:24.837556 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:24.837527 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:24.837991 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:24.837620 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:24.837991 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:24.837668 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:24.837991 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:24.837710 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:26.837166 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:26.837133 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:26.837607 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:26.837172 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:26.837607 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:26.837261 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:26.837607 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:26.837402 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:28.462300 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:28.462269 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:28.462728 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:28.462316 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7rr\" (UniqueName: \"kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr\") pod \"network-check-target-xmjsn\" (UID: \"2a1fbae2-2df7-41eb-9ed9-aac09b5af692\") " pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:28.462728 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:28.462412 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:28.462728 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:28.462433 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:28.462728 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:28.462445 2562 projected.go:194] Error preparing data for projected volume kube-api-access-7k7rr for pod openshift-network-diagnostics/network-check-target-xmjsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:28.462728 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:28.462410 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:28.462728 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:28.462487 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr podName:2a1fbae2-2df7-41eb-9ed9-aac09b5af692 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:00.462473188 +0000 UTC m=+66.157378432 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7k7rr" (UniqueName: "kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr") pod "network-check-target-xmjsn" (UID: "2a1fbae2-2df7-41eb-9ed9-aac09b5af692") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:28.462728 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:28.462543 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs podName:8372a373-96b3-40a7-a175-86077c4b2030 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:00.46252949 +0000 UTC m=+66.157434734 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs") pod "network-metrics-daemon-kdj59" (UID: "8372a373-96b3-40a7-a175-86077c4b2030") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:28.837289 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:28.837230 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:28.837289 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:28.837266 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:28.837438 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:28.837354 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:28.837474 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:28.837461 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:29.974147 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:29.974117 2562 generic.go:358] "Generic (PLEG): container finished" podID="db736963-96fd-4537-b82e-5a28f2543a84" containerID="daccd4ea71f7b75f89915958b0d4defad6f37e7006cbcd2f417d1bf68ff1a27a" exitCode=0
Apr 23 13:32:29.974525 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:29.974162 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87dsw" event={"ID":"db736963-96fd-4537-b82e-5a28f2543a84","Type":"ContainerDied","Data":"daccd4ea71f7b75f89915958b0d4defad6f37e7006cbcd2f417d1bf68ff1a27a"}
Apr 23 13:32:30.837214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:30.837180 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:30.837405 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:30.837183 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:30.837405 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:30.837301 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:30.837405 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:30.837376 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:30.979053 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:30.979003 2562 generic.go:358] "Generic (PLEG): container finished" podID="db736963-96fd-4537-b82e-5a28f2543a84" containerID="40c92d9ac71edcc5c22731725875a3b117d19a608672797aec0f8994211e8243" exitCode=0
Apr 23 13:32:30.979053 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:30.979046 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87dsw" event={"ID":"db736963-96fd-4537-b82e-5a28f2543a84","Type":"ContainerDied","Data":"40c92d9ac71edcc5c22731725875a3b117d19a608672797aec0f8994211e8243"}
Apr 23 13:32:31.982944 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:31.982905 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87dsw" event={"ID":"db736963-96fd-4537-b82e-5a28f2543a84","Type":"ContainerStarted","Data":"b4aab30cf85e5858b5ba4d4b5cca917f9ae5f10076325b3a742a13e07c7e8986"}
Apr 23 13:32:32.011221 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:32.011180 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-87dsw" podStartSLOduration=6.504740824 podStartE2EDuration="38.011168093s" podCreationTimestamp="2026-04-23 13:31:54 +0000 UTC" firstStartedPulling="2026-04-23 13:31:57.391170181 +0000 UTC m=+3.086075427" lastFinishedPulling="2026-04-23 13:32:28.89759744 +0000 UTC m=+34.592502696" observedRunningTime="2026-04-23 13:32:32.010875617 +0000 UTC m=+37.705780884" watchObservedRunningTime="2026-04-23 13:32:32.011168093 +0000 UTC m=+37.706073359"
Apr 23 13:32:32.836894 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:32.836862 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:32.837106 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:32.836863 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:32.837106 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:32.836972 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:32.837106 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:32.837045 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:34.837840 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:34.837811 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:34.838277 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:34.837903 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:34.838277 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:34.837992 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:34.838277 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:34.838116 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:36.837322 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:36.837280 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:36.837322 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:36.837294 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:36.837999 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:36.837385 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:36.837999 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:36.837506 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:38.836792 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:38.836761 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:38.837179 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:38.836761 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:38.837179 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:38.836864 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:38.837179 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:38.836917 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:39.807050 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:39.806066 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kdj59"]
Apr 23 13:32:39.807050 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:39.806214 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:39.807050 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:39.806328 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:39.808264 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:39.808242 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xmjsn"]
Apr 23 13:32:39.808401 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:39.808383 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:39.808508 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:39.808485 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:41.836707 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:41.836675 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:41.837369 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:41.836682 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:41.837369 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:41.836862 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030"
Apr 23 13:32:41.837369 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:41.836760 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692"
Apr 23 13:32:43.836640 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:43.836600 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:32:43.837206 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:43.836762 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdj59" podUID="8372a373-96b3-40a7-a175-86077c4b2030" Apr 23 13:32:43.837206 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:43.836800 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:32:43.837206 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:43.836881 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmjsn" podUID="2a1fbae2-2df7-41eb-9ed9-aac09b5af692" Apr 23 13:32:44.099534 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.099474 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-22.ec2.internal" event="NodeReady" Apr 23 13:32:44.099647 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.099607 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:32:44.158404 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.158379 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9b4c5"] Apr 23 13:32:44.176197 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.176164 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wb5pq"] Apr 23 13:32:44.176346 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.176329 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.179141 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.179117 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 13:32:44.179270 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.179172 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4t8gv\"" Apr 23 13:32:44.179270 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.179178 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 13:32:44.188074 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.188011 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9b4c5"] Apr 23 13:32:44.188213 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.188186 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.189097 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.189077 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wb5pq"] Apr 23 13:32:44.191824 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.191809 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 13:32:44.191914 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.191811 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 13:32:44.191975 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.191941 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-785x5\"" Apr 23 13:32:44.192107 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.192088 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 13:32:44.192223 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.192157 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 13:32:44.263687 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.263661 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ljjs8"] Apr 23 13:32:44.273353 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.273330 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3af26c35-b8fa-492e-89df-4d39fa887de9-metrics-tls\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.273437 ip-10-0-141-22 
kubenswrapper[2562]: I0423 13:32:44.273358 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hpqf\" (UniqueName: \"kubernetes.io/projected/3af26c35-b8fa-492e-89df-4d39fa887de9-kube-api-access-6hpqf\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.273437 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.273378 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af26c35-b8fa-492e-89df-4d39fa887de9-config-volume\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.273504 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.273445 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3af26c35-b8fa-492e-89df-4d39fa887de9-tmp-dir\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.280392 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.280374 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ljjs8"] Apr 23 13:32:44.280469 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.280449 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ljjs8" Apr 23 13:32:44.283303 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.283282 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4c4k9\"" Apr 23 13:32:44.283303 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.283301 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 13:32:44.283454 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.283323 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 13:32:44.283600 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.283587 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 13:32:44.374222 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.374175 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cb4ae027-c824-44ff-bf3c-22c33097e46b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.374222 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.374208 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3af26c35-b8fa-492e-89df-4d39fa887de9-metrics-tls\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.374366 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.374226 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hpqf\" (UniqueName: 
\"kubernetes.io/projected/3af26c35-b8fa-492e-89df-4d39fa887de9-kube-api-access-6hpqf\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.374366 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.374246 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af26c35-b8fa-492e-89df-4d39fa887de9-config-volume\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.374366 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.374270 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cb4ae027-c824-44ff-bf3c-22c33097e46b-data-volume\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.374366 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.374309 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cb4ae027-c824-44ff-bf3c-22c33097e46b-crio-socket\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.374366 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.374331 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcrs6\" (UniqueName: \"kubernetes.io/projected/cb4ae027-c824-44ff-bf3c-22c33097e46b-kube-api-access-bcrs6\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.374366 ip-10-0-141-22 kubenswrapper[2562]: I0423 
13:32:44.374361 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cb4ae027-c824-44ff-bf3c-22c33097e46b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.374603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.374407 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3af26c35-b8fa-492e-89df-4d39fa887de9-tmp-dir\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.374676 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.374661 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3af26c35-b8fa-492e-89df-4d39fa887de9-tmp-dir\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.374807 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.374791 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af26c35-b8fa-492e-89df-4d39fa887de9-config-volume\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.378518 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.378500 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3af26c35-b8fa-492e-89df-4d39fa887de9-metrics-tls\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.394772 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.394748 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hpqf\" (UniqueName: \"kubernetes.io/projected/3af26c35-b8fa-492e-89df-4d39fa887de9-kube-api-access-6hpqf\") pod \"dns-default-9b4c5\" (UID: \"3af26c35-b8fa-492e-89df-4d39fa887de9\") " pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.474680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.474654 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cb4ae027-c824-44ff-bf3c-22c33097e46b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.474780 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.474708 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cb4ae027-c824-44ff-bf3c-22c33097e46b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.474780 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.474750 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cb4ae027-c824-44ff-bf3c-22c33097e46b-data-volume\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.474874 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.474788 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce-cert\") pod \"ingress-canary-ljjs8\" (UID: \"0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce\") " 
pod="openshift-ingress-canary/ingress-canary-ljjs8" Apr 23 13:32:44.474874 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.474811 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lkz4\" (UniqueName: \"kubernetes.io/projected/0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce-kube-api-access-6lkz4\") pod \"ingress-canary-ljjs8\" (UID: \"0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce\") " pod="openshift-ingress-canary/ingress-canary-ljjs8" Apr 23 13:32:44.474976 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.474868 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cb4ae027-c824-44ff-bf3c-22c33097e46b-crio-socket\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.474976 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.474908 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcrs6\" (UniqueName: \"kubernetes.io/projected/cb4ae027-c824-44ff-bf3c-22c33097e46b-kube-api-access-bcrs6\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.475101 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.475035 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cb4ae027-c824-44ff-bf3c-22c33097e46b-crio-socket\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.475177 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.475159 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/cb4ae027-c824-44ff-bf3c-22c33097e46b-data-volume\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.475311 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.475293 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cb4ae027-c824-44ff-bf3c-22c33097e46b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.476696 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.476681 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cb4ae027-c824-44ff-bf3c-22c33097e46b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.482859 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.482835 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcrs6\" (UniqueName: \"kubernetes.io/projected/cb4ae027-c824-44ff-bf3c-22c33097e46b-kube-api-access-bcrs6\") pod \"insights-runtime-extractor-wb5pq\" (UID: \"cb4ae027-c824-44ff-bf3c-22c33097e46b\") " pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.486797 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.486775 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:44.496416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.496398 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wb5pq" Apr 23 13:32:44.582079 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.581205 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce-cert\") pod \"ingress-canary-ljjs8\" (UID: \"0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce\") " pod="openshift-ingress-canary/ingress-canary-ljjs8" Apr 23 13:32:44.582079 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.581255 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lkz4\" (UniqueName: \"kubernetes.io/projected/0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce-kube-api-access-6lkz4\") pod \"ingress-canary-ljjs8\" (UID: \"0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce\") " pod="openshift-ingress-canary/ingress-canary-ljjs8" Apr 23 13:32:44.592761 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.592704 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce-cert\") pod \"ingress-canary-ljjs8\" (UID: \"0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce\") " pod="openshift-ingress-canary/ingress-canary-ljjs8" Apr 23 13:32:44.595066 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.594996 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lkz4\" (UniqueName: \"kubernetes.io/projected/0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce-kube-api-access-6lkz4\") pod \"ingress-canary-ljjs8\" (UID: \"0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce\") " pod="openshift-ingress-canary/ingress-canary-ljjs8" Apr 23 13:32:44.660498 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.660476 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wb5pq"] Apr 23 13:32:44.662644 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.662624 2562 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-dns/dns-default-9b4c5"] Apr 23 13:32:44.665283 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:32:44.665259 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af26c35_b8fa_492e_89df_4d39fa887de9.slice/crio-33bbdb7eb02a86464e9d0fe14759547ab017c711189d22a6982e424e542761a3 WatchSource:0}: Error finding container 33bbdb7eb02a86464e9d0fe14759547ab017c711189d22a6982e424e542761a3: Status 404 returned error can't find the container with id 33bbdb7eb02a86464e9d0fe14759547ab017c711189d22a6982e424e542761a3 Apr 23 13:32:44.797368 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.797347 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"] Apr 23 13:32:44.803461 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.803445 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" Apr 23 13:32:44.805866 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.805847 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 23 13:32:44.805975 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.805945 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 23 13:32:44.805975 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.805948 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 13:32:44.806282 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.806267 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 13:32:44.806330 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.806267 2562 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 13:32:44.806366 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.806352 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-6fjbs\"" Apr 23 13:32:44.808637 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.808619 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"] Apr 23 13:32:44.888257 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.888233 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ljjs8" Apr 23 13:32:44.983475 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.983448 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" Apr 23 13:32:44.983596 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.983505 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" Apr 23 13:32:44.983596 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.983566 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62n8f\" (UniqueName: 
\"kubernetes.io/projected/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-kube-api-access-62n8f\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" Apr 23 13:32:44.983684 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.983601 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" Apr 23 13:32:44.997158 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:44.997130 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ljjs8"] Apr 23 13:32:45.004235 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:32:45.004211 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c46498b_5ca7_4562_8e0f_dc82cd5bb6ce.slice/crio-bac04c30e6c0fb08f439a01d5a4193b47497e7c840887f292782680f53ad42ee WatchSource:0}: Error finding container bac04c30e6c0fb08f439a01d5a4193b47497e7c840887f292782680f53ad42ee: Status 404 returned error can't find the container with id bac04c30e6c0fb08f439a01d5a4193b47497e7c840887f292782680f53ad42ee Apr 23 13:32:45.007046 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.007003 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wb5pq" event={"ID":"cb4ae027-c824-44ff-bf3c-22c33097e46b","Type":"ContainerStarted","Data":"79e9e1ae4e85912c925d333d4355c6dffc513c0196bbe29116af1637ef3cea76"} Apr 23 13:32:45.007149 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.007058 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wb5pq" 
event={"ID":"cb4ae027-c824-44ff-bf3c-22c33097e46b","Type":"ContainerStarted","Data":"ddd394b1007e18249a63617ee248faada921a06abc9e4b8512dd2d7a97c68d23"} Apr 23 13:32:45.007977 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.007924 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9b4c5" event={"ID":"3af26c35-b8fa-492e-89df-4d39fa887de9","Type":"ContainerStarted","Data":"33bbdb7eb02a86464e9d0fe14759547ab017c711189d22a6982e424e542761a3"} Apr 23 13:32:45.084028 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.083999 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" Apr 23 13:32:45.084118 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.084059 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" Apr 23 13:32:45.084118 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.084086 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62n8f\" (UniqueName: \"kubernetes.io/projected/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-kube-api-access-62n8f\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" Apr 23 13:32:45.084118 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.084110 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"
Apr 23 13:32:45.084223 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:45.084208 2562 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 23 13:32:45.084281 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:45.084270 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-tls podName:b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb nodeName:}" failed. No retries permitted until 2026-04-23 13:32:45.584252175 +0000 UTC m=+51.279157431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-5ntk5" (UID: "b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb") : secret "prometheus-operator-tls" not found
Apr 23 13:32:45.084727 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.084702 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"
Apr 23 13:32:45.086989 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.086968 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"
Apr 23 13:32:45.092465 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.092429 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62n8f\" (UniqueName: \"kubernetes.io/projected/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-kube-api-access-62n8f\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"
Apr 23 13:32:45.588181 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.588158 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"
Apr 23 13:32:45.588274 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:45.588261 2562 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 23 13:32:45.588319 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:45.588311 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-tls podName:b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb nodeName:}" failed. No retries permitted until 2026-04-23 13:32:46.588296384 +0000 UTC m=+52.283201629 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-5ntk5" (UID: "b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb") : secret "prometheus-operator-tls" not found
Apr 23 13:32:45.836361 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.836294 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:32:45.836498 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.836294 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59"
Apr 23 13:32:45.839329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.839304 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 13:32:45.840665 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.840518 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 13:32:45.840665 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.840548 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 13:32:45.840665 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.840597 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfgql\""
Apr 23 13:32:45.840665 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:45.840548 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c6f6d\""
Apr 23 13:32:46.011304 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:46.011271 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ljjs8" event={"ID":"0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce","Type":"ContainerStarted","Data":"bac04c30e6c0fb08f439a01d5a4193b47497e7c840887f292782680f53ad42ee"}
Apr 23 13:32:46.013467 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:46.013425 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wb5pq" event={"ID":"cb4ae027-c824-44ff-bf3c-22c33097e46b","Type":"ContainerStarted","Data":"86c00f92301ecb7a972bc493fcd240e2f96ef0dc394c9f2023c91044193b6edc"}
Apr 23 13:32:46.595562 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:46.595513 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"
Apr 23 13:32:46.598941 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:46.598913 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5ntk5\" (UID: \"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"
Apr 23 13:32:46.615713 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:46.615689 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"
Apr 23 13:32:47.754310 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:47.754157 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5ntk5"]
Apr 23 13:32:47.757123 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:32:47.757094 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb81ccbe5_04d0_48c5_aa20_f3c2386ffeeb.slice/crio-50c0cdbe566e15168086fd7841c7387ca1301b3ce8a86bc07f20af8bbe522032 WatchSource:0}: Error finding container 50c0cdbe566e15168086fd7841c7387ca1301b3ce8a86bc07f20af8bbe522032: Status 404 returned error can't find the container with id 50c0cdbe566e15168086fd7841c7387ca1301b3ce8a86bc07f20af8bbe522032
Apr 23 13:32:48.020073 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:48.020039 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wb5pq" event={"ID":"cb4ae027-c824-44ff-bf3c-22c33097e46b","Type":"ContainerStarted","Data":"304ffc8b204a3da03d54d1a232c82a8c79f0049f5bf8298e47fdf83bb8328933"}
Apr 23 13:32:48.021096 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:48.021068 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" event={"ID":"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb","Type":"ContainerStarted","Data":"50c0cdbe566e15168086fd7841c7387ca1301b3ce8a86bc07f20af8bbe522032"}
Apr 23 13:32:48.022484 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:48.022463 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9b4c5" event={"ID":"3af26c35-b8fa-492e-89df-4d39fa887de9","Type":"ContainerStarted","Data":"9dd251c561faae500a8597c1047575881b2466a79107aed4154ee7960909356d"}
Apr 23 13:32:48.022600 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:48.022488 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9b4c5" event={"ID":"3af26c35-b8fa-492e-89df-4d39fa887de9","Type":"ContainerStarted","Data":"312bfa948cea12166a9f9866be5f944d2096675341ecaa700720a2f633b27930"}
Apr 23 13:32:48.022600 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:48.022580 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9b4c5"
Apr 23 13:32:48.023680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:48.023658 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ljjs8" event={"ID":"0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce","Type":"ContainerStarted","Data":"404814e7a82bf5e9895551d323a1bc61b301988d06c714aa877067fc64e15666"}
Apr 23 13:32:48.037145 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:48.037100 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wb5pq" podStartSLOduration=1.175774441 podStartE2EDuration="4.037088078s" podCreationTimestamp="2026-04-23 13:32:44 +0000 UTC" firstStartedPulling="2026-04-23 13:32:44.766644403 +0000 UTC m=+50.461549651" lastFinishedPulling="2026-04-23 13:32:47.627958034 +0000 UTC m=+53.322863288" observedRunningTime="2026-04-23 13:32:48.036397668 +0000 UTC m=+53.731302945" watchObservedRunningTime="2026-04-23 13:32:48.037088078 +0000 UTC m=+53.731993345"
Apr 23 13:32:48.052340 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:48.052304 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ljjs8" podStartSLOduration=1.435845305 podStartE2EDuration="4.052294076s" podCreationTimestamp="2026-04-23 13:32:44 +0000 UTC" firstStartedPulling="2026-04-23 13:32:45.00767176 +0000 UTC m=+50.702577005" lastFinishedPulling="2026-04-23 13:32:47.624120506 +0000 UTC m=+53.319025776" observedRunningTime="2026-04-23 13:32:48.0516086 +0000 UTC m=+53.746513866" watchObservedRunningTime="2026-04-23 13:32:48.052294076 +0000 UTC m=+53.747199343"
Apr 23 13:32:48.074874 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:48.074840 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9b4c5" podStartSLOduration=1.31788018 podStartE2EDuration="4.074826909s" podCreationTimestamp="2026-04-23 13:32:44 +0000 UTC" firstStartedPulling="2026-04-23 13:32:44.667054587 +0000 UTC m=+50.361959830" lastFinishedPulling="2026-04-23 13:32:47.424001312 +0000 UTC m=+53.118906559" observedRunningTime="2026-04-23 13:32:48.074253459 +0000 UTC m=+53.769158724" watchObservedRunningTime="2026-04-23 13:32:48.074826909 +0000 UTC m=+53.769732153"
Apr 23 13:32:50.030055 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:50.030007 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" event={"ID":"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb","Type":"ContainerStarted","Data":"cb6434787abf8596ea8242c99f84f90a78b3b7fa58d428c43d7b48dcaad094b1"}
Apr 23 13:32:50.030055 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:50.030053 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" event={"ID":"b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb","Type":"ContainerStarted","Data":"1d9b34a4b307c10736bcfaee2a292e3c967e1c329706ec9cb9944e2ded7bb068"}
Apr 23 13:32:50.046946 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:50.046906 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-5ntk5" podStartSLOduration=4.429271119 podStartE2EDuration="6.046894322s" podCreationTimestamp="2026-04-23 13:32:44 +0000 UTC" firstStartedPulling="2026-04-23 13:32:47.759872582 +0000 UTC m=+53.454777836" lastFinishedPulling="2026-04-23 13:32:49.377495784 +0000 UTC m=+55.072401039" observedRunningTime="2026-04-23 13:32:50.045679694 +0000 UTC m=+55.740584959" watchObservedRunningTime="2026-04-23 13:32:50.046894322 +0000 UTC m=+55.741799587"
Apr 23 13:32:52.159124 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.159092 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"]
Apr 23 13:32:52.162380 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.162366 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"
Apr 23 13:32:52.164813 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.164785 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 23 13:32:52.164813 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.164802 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 23 13:32:52.165082 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.165068 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-xjp4b\""
Apr 23 13:32:52.173138 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.173120 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"]
Apr 23 13:32:52.180555 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.180533 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7vs52"]
Apr 23 13:32:52.184004 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.183990 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.186549 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.186535 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 23 13:32:52.186629 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.186582 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-vzn8j\""
Apr 23 13:32:52.186629 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.186610 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 23 13:32:52.186828 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.186813 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 23 13:32:52.195339 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.195320 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7vs52"]
Apr 23 13:32:52.206950 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.206933 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mmtpr"]
Apr 23 13:32:52.210464 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.210440 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.214481 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.214459 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 13:32:52.214481 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.214474 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 13:32:52.214620 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.214488 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-75lnf\""
Apr 23 13:32:52.214620 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.214581 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 13:32:52.231664 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231642 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkv7r\" (UniqueName: \"kubernetes.io/projected/e291a639-f923-4a04-9d39-8c585ab8e111-kube-api-access-vkv7r\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"
Apr 23 13:32:52.231760 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231676 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.231760 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231693 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f81a213-c585-418c-a4c7-d571e37e829d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.231760 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231738 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e291a639-f923-4a04-9d39-8c585ab8e111-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"
Apr 23 13:32:52.231878 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231773 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e291a639-f923-4a04-9d39-8c585ab8e111-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"
Apr 23 13:32:52.231878 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231790 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e291a639-f923-4a04-9d39-8c585ab8e111-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"
Apr 23 13:32:52.231878 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231807 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3fc0734-b575-4b05-b44c-8457c8db77d5-metrics-client-ca\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.231878 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231830 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3fc0734-b575-4b05-b44c-8457c8db77d5-sys\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.231878 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231846 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-tls\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.232077 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231885 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.232077 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231911 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.232077 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231929 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b3fc0734-b575-4b05-b44c-8457c8db77d5-root\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.232077 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231960 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkh4t\" (UniqueName: \"kubernetes.io/projected/7f81a213-c585-418c-a4c7-d571e37e829d-kube-api-access-mkh4t\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.232077 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.231990 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7f81a213-c585-418c-a4c7-d571e37e829d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.232077 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.232027 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-textfile\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.232077 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.232055 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg28c\" (UniqueName: \"kubernetes.io/projected/b3fc0734-b575-4b05-b44c-8457c8db77d5-kube-api-access-sg28c\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.232077 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.232073 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.232295 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.232090 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-wtmp\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.232295 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.232105 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-accelerators-collector-config\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.333327 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333302 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7f81a213-c585-418c-a4c7-d571e37e829d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.333327 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333328 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-textfile\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.333449 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333344 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg28c\" (UniqueName: \"kubernetes.io/projected/b3fc0734-b575-4b05-b44c-8457c8db77d5-kube-api-access-sg28c\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.333449 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333361 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.333449 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333378 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-wtmp\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.333601 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:52.333476 2562 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 23 13:32:52.333601 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333499 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-wtmp\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.333601 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333517 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-accelerators-collector-config\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.333601 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:52.333541 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-tls podName:7f81a213-c585-418c-a4c7-d571e37e829d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:52.833524036 +0000 UTC m=+58.528429289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-7vs52" (UID: "7f81a213-c585-418c-a4c7-d571e37e829d") : secret "kube-state-metrics-tls" not found
Apr 23 13:32:52.333601 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333570 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkv7r\" (UniqueName: \"kubernetes.io/projected/e291a639-f923-4a04-9d39-8c585ab8e111-kube-api-access-vkv7r\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"
Apr 23 13:32:52.333851 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333611 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.333851 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333644 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f81a213-c585-418c-a4c7-d571e37e829d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.333851 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333677 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e291a639-f923-4a04-9d39-8c585ab8e111-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"
Apr 23 13:32:52.333851 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333707 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e291a639-f923-4a04-9d39-8c585ab8e111-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"
Apr 23 13:32:52.333851 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333732 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e291a639-f923-4a04-9d39-8c585ab8e111-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"
Apr 23 13:32:52.333851 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333759 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3fc0734-b575-4b05-b44c-8457c8db77d5-metrics-client-ca\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.333851 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333783 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3fc0734-b575-4b05-b44c-8457c8db77d5-sys\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.333851 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333809 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-tls\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.333851 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:52.333829 2562 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 23 13:32:52.334309 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333837 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.334309 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333679 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-textfile\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.334309 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.333888 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.334309 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:52.333960 2562 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 23 13:32:52.334309 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:52.334000 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-tls podName:b3fc0734-b575-4b05-b44c-8457c8db77d5 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:52.833984473 +0000 UTC m=+58.528889721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-tls") pod "node-exporter-mmtpr" (UID: "b3fc0734-b575-4b05-b44c-8457c8db77d5") : secret "node-exporter-tls" not found
Apr 23 13:32:52.334309 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.334071 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3fc0734-b575-4b05-b44c-8457c8db77d5-sys\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.334309 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.334131 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b3fc0734-b575-4b05-b44c-8457c8db77d5-root\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.334309 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.334181 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkh4t\" (UniqueName: \"kubernetes.io/projected/7f81a213-c585-418c-a4c7-d571e37e829d-kube-api-access-mkh4t\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.334309 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.334298 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7f81a213-c585-418c-a4c7-d571e37e829d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.334745 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.334310 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-accelerators-collector-config\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.334745 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.334362 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3fc0734-b575-4b05-b44c-8457c8db77d5-metrics-client-ca\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr"
Apr 23 13:32:52.334745 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.334381 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f81a213-c585-418c-a4c7-d571e37e829d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52"
Apr 23 13:32:52.334745 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.334368 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/b3fc0734-b575-4b05-b44c-8457c8db77d5-root\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr" Apr 23 13:32:52.334745 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:52.334410 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e291a639-f923-4a04-9d39-8c585ab8e111-openshift-state-metrics-tls podName:e291a639-f923-4a04-9d39-8c585ab8e111 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:52.834391009 +0000 UTC m=+58.529296266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e291a639-f923-4a04-9d39-8c585ab8e111-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-8vwjk" (UID: "e291a639-f923-4a04-9d39-8c585ab8e111") : secret "openshift-state-metrics-tls" not found Apr 23 13:32:52.335045 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.334809 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e291a639-f923-4a04-9d39-8c585ab8e111-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" Apr 23 13:32:52.335045 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.335007 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" Apr 23 13:32:52.337475 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.337454 2562 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr" Apr 23 13:32:52.337552 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.337494 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e291a639-f923-4a04-9d39-8c585ab8e111-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" Apr 23 13:32:52.337603 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.337588 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" Apr 23 13:32:52.341926 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.341896 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg28c\" (UniqueName: \"kubernetes.io/projected/b3fc0734-b575-4b05-b44c-8457c8db77d5-kube-api-access-sg28c\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr" Apr 23 13:32:52.342192 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.342175 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkv7r\" (UniqueName: \"kubernetes.io/projected/e291a639-f923-4a04-9d39-8c585ab8e111-kube-api-access-vkv7r\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: 
\"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" Apr 23 13:32:52.342536 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.342516 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkh4t\" (UniqueName: \"kubernetes.io/projected/7f81a213-c585-418c-a4c7-d571e37e829d-kube-api-access-mkh4t\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" Apr 23 13:32:52.837732 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.837705 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e291a639-f923-4a04-9d39-8c585ab8e111-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" Apr 23 13:32:52.837898 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.837746 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-tls\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr" Apr 23 13:32:52.837968 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.837934 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" Apr 23 13:32:52.840094 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.840068 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b3fc0734-b575-4b05-b44c-8457c8db77d5-node-exporter-tls\") pod \"node-exporter-mmtpr\" (UID: \"b3fc0734-b575-4b05-b44c-8457c8db77d5\") " pod="openshift-monitoring/node-exporter-mmtpr" Apr 23 13:32:52.840203 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.840116 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e291a639-f923-4a04-9d39-8c585ab8e111-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8vwjk\" (UID: \"e291a639-f923-4a04-9d39-8c585ab8e111\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" Apr 23 13:32:52.840203 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.840169 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f81a213-c585-418c-a4c7-d571e37e829d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vs52\" (UID: \"7f81a213-c585-418c-a4c7-d571e37e829d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" Apr 23 13:32:52.970210 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:52.970186 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmczc" Apr 23 13:32:53.070870 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.070844 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" Apr 23 13:32:53.092484 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.092436 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" Apr 23 13:32:53.122304 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.122201 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-mmtpr" Apr 23 13:32:53.138380 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:32:53.138335 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3fc0734_b575_4b05_b44c_8457c8db77d5.slice/crio-9c3885e5165a81954eb309b98dfa52f02880e734c9e9dc39d59473f80a986052 WatchSource:0}: Error finding container 9c3885e5165a81954eb309b98dfa52f02880e734c9e9dc39d59473f80a986052: Status 404 returned error can't find the container with id 9c3885e5165a81954eb309b98dfa52f02880e734c9e9dc39d59473f80a986052 Apr 23 13:32:53.211756 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.211729 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk"] Apr 23 13:32:53.214957 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:32:53.214932 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode291a639_f923_4a04_9d39_8c585ab8e111.slice/crio-d26ed94821848adad8e32e1a2f078ec7bf9331b1abc1c28a36b4d0f86a33958a WatchSource:0}: Error finding container d26ed94821848adad8e32e1a2f078ec7bf9331b1abc1c28a36b4d0f86a33958a: Status 404 returned error can't find the container with id d26ed94821848adad8e32e1a2f078ec7bf9331b1abc1c28a36b4d0f86a33958a Apr 23 13:32:53.226787 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.226758 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7vs52"] Apr 23 13:32:53.229228 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:32:53.229207 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f81a213_c585_418c_a4c7_d571e37e829d.slice/crio-2ab0311c5c450a984e34067b46eada08af1742f634c97999239a9cd90fdc6e1d WatchSource:0}: Error finding container 
2ab0311c5c450a984e34067b46eada08af1742f634c97999239a9cd90fdc6e1d: Status 404 returned error can't find the container with id 2ab0311c5c450a984e34067b46eada08af1742f634c97999239a9cd90fdc6e1d Apr 23 13:32:53.291698 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.291676 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:32:53.297515 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.296821 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.299994 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.299974 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 13:32:53.300150 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.300127 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-twjdz\"" Apr 23 13:32:53.300220 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.299987 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 13:32:53.300304 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.300282 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 13:32:53.300724 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.300697 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 13:32:53.301046 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.300985 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 13:32:53.301133 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.301076 2562 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 13:32:53.301433 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.301268 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 13:32:53.301433 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.301353 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 13:32:53.301771 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.301751 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 13:32:53.310827 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.310803 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:32:53.343055 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.342980 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343055 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343032 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-config-out\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343109 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343148 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343171 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343214 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343194 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343414 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343257 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-web-config\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343414 
ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343314 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343414 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343353 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-config-volume\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343414 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343385 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343414 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343409 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343708 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343453 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.343708 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.343480 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqz7j\" (UniqueName: \"kubernetes.io/projected/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-kube-api-access-kqz7j\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.444520 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.444485 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.444520 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.444522 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.444729 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.444543 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.444729 ip-10-0-141-22 kubenswrapper[2562]: 
I0423 13:32:53.444564 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.444729 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.444590 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-web-config\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.444875 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.444786 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.444875 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.444846 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-config-volume\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.444975 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.444883 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 
13:32:53.444975 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.444910 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.444975 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.444960 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.445158 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.444987 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqz7j\" (UniqueName: \"kubernetes.io/projected/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-kube-api-access-kqz7j\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.445158 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.445063 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.445158 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:53.445085 2562 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 23 13:32:53.445158 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.445097 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-config-out\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.445341 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:32:53.445168 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-main-tls podName:4b9e094f-6f9f-4001-aadf-4b45daf3c0fa nodeName:}" failed. No retries permitted until 2026-04-23 13:32:53.945147825 +0000 UTC m=+59.640053086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "4b9e094f-6f9f-4001-aadf-4b45daf3c0fa") : secret "alertmanager-main-tls" not found Apr 23 13:32:53.445461 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.445441 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.445535 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.445503 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.446326 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.446278 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.447625 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.447605 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.447732 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.447686 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-web-config\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.448200 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.448148 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.448200 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.448169 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-config-out\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.448375 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.448356 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.448501 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.448483 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-config-volume\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.448722 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.448705 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.448856 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.448841 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.454213 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.454194 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqz7j\" (UniqueName: \"kubernetes.io/projected/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-kube-api-access-kqz7j\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.949365 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.949316 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:53.953612 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:53.953305 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4b9e094f-6f9f-4001-aadf-4b45daf3c0fa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:54.042548 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.042508 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mmtpr" event={"ID":"b3fc0734-b575-4b05-b44c-8457c8db77d5","Type":"ContainerStarted","Data":"9c3885e5165a81954eb309b98dfa52f02880e734c9e9dc39d59473f80a986052"} Apr 23 13:32:54.044416 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.044390 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" event={"ID":"e291a639-f923-4a04-9d39-8c585ab8e111","Type":"ContainerStarted","Data":"30dbfa97570d7d42aafeb59db1f77a331a800c544e6b558281caecfd901661af"} Apr 23 13:32:54.044522 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.044426 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" event={"ID":"e291a639-f923-4a04-9d39-8c585ab8e111","Type":"ContainerStarted","Data":"21ab221840643f9dd29b9ad394322e9e0ad81d0ee6943e00fe2ef4294533dca6"} Apr 23 13:32:54.044522 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.044441 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" event={"ID":"e291a639-f923-4a04-9d39-8c585ab8e111","Type":"ContainerStarted","Data":"d26ed94821848adad8e32e1a2f078ec7bf9331b1abc1c28a36b4d0f86a33958a"} Apr 23 13:32:54.045500 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.045475 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" event={"ID":"7f81a213-c585-418c-a4c7-d571e37e829d","Type":"ContainerStarted","Data":"2ab0311c5c450a984e34067b46eada08af1742f634c97999239a9cd90fdc6e1d"} Apr 23 13:32:54.219436 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.219351 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:32:54.261208 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.261185 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-75d9456954-qwsqf"] Apr 23 13:32:54.264486 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.264454 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.267284 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.267238 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 13:32:54.267284 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.267253 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 13:32:54.267452 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.267248 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 13:32:54.267509 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.267460 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-tndjf\"" Apr 23 13:32:54.267806 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.267786 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 13:32:54.268700 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.268546 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 13:32:54.268700 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.268595 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-73d2412vgfdak\"" Apr 23 13:32:54.276832 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.276798 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-75d9456954-qwsqf"] Apr 23 13:32:54.352374 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.352349 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-m2sfc\" (UniqueName: \"kubernetes.io/projected/b8501b35-5e05-402b-b003-7a7a5e9a5a84-kube-api-access-m2sfc\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.352517 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.352391 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.352517 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.352436 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8501b35-5e05-402b-b003-7a7a5e9a5a84-metrics-client-ca\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.352517 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.352490 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.352694 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.352527 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.352694 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.352551 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-grpc-tls\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.352694 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.352578 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.352694 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.352609 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-tls\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.453735 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.453688 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy-web\") pod 
\"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.453735 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.453730 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-grpc-tls\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.453946 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.453760 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.453946 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.453794 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-tls\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.453946 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.453851 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2sfc\" (UniqueName: \"kubernetes.io/projected/b8501b35-5e05-402b-b003-7a7a5e9a5a84-kube-api-access-m2sfc\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.453946 ip-10-0-141-22 kubenswrapper[2562]: I0423 
13:32:54.453893 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.453946 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.453939 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8501b35-5e05-402b-b003-7a7a5e9a5a84-metrics-client-ca\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.454203 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.453971 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.454946 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.454893 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8501b35-5e05-402b-b003-7a7a5e9a5a84-metrics-client-ca\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.456821 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.456794 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.457295 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.457254 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-tls\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.457422 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.457388 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-grpc-tls\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.457532 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.457401 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.457532 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.457499 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " 
pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.457532 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.457511 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b8501b35-5e05-402b-b003-7a7a5e9a5a84-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.462641 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.462619 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2sfc\" (UniqueName: \"kubernetes.io/projected/b8501b35-5e05-402b-b003-7a7a5e9a5a84-kube-api-access-m2sfc\") pod \"thanos-querier-75d9456954-qwsqf\" (UID: \"b8501b35-5e05-402b-b003-7a7a5e9a5a84\") " pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.576477 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.576058 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:32:54.825442 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.825382 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:32:54.850129 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:54.850107 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-75d9456954-qwsqf"] Apr 23 13:32:54.888444 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:32:54.888409 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8501b35_5e05_402b_b003_7a7a5e9a5a84.slice/crio-f427874b1a65ba7938ecf9337cf9a9ac1122486a544d1948e4b9595418745e6b WatchSource:0}: Error finding container f427874b1a65ba7938ecf9337cf9a9ac1122486a544d1948e4b9595418745e6b: Status 404 returned error can't find the container with id f427874b1a65ba7938ecf9337cf9a9ac1122486a544d1948e4b9595418745e6b Apr 23 13:32:55.049856 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:55.049822 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" event={"ID":"b8501b35-5e05-402b-b003-7a7a5e9a5a84","Type":"ContainerStarted","Data":"f427874b1a65ba7938ecf9337cf9a9ac1122486a544d1948e4b9595418745e6b"} Apr 23 13:32:55.050835 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:55.050807 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa","Type":"ContainerStarted","Data":"03c23063f4edc31ceed79c0072d3a5c0c4f28c3dc34a9234e1868e5522e53f29"} Apr 23 13:32:55.052623 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:55.052599 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" 
event={"ID":"7f81a213-c585-418c-a4c7-d571e37e829d","Type":"ContainerStarted","Data":"3a34215bb2fd0ebff195354a3069185a282dd65a563dc23ec000793b3bc50212"} Apr 23 13:32:55.052718 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:55.052630 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" event={"ID":"7f81a213-c585-418c-a4c7-d571e37e829d","Type":"ContainerStarted","Data":"680bb345c7012f94482f446aa22f39cb1631feb83699059d15e722fb6f210cb9"} Apr 23 13:32:55.052718 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:55.052644 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" event={"ID":"7f81a213-c585-418c-a4c7-d571e37e829d","Type":"ContainerStarted","Data":"f80716dc4bdd57b63e33bca6baf7c26de9b05cb97ffcec04c95fa57a1b2ffa66"} Apr 23 13:32:55.053993 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:55.053967 2562 generic.go:358] "Generic (PLEG): container finished" podID="b3fc0734-b575-4b05-b44c-8457c8db77d5" containerID="02654fa85d48020cfbed7c4fba3917a32a75497d767fe1dd17f0133148e1f238" exitCode=0 Apr 23 13:32:55.054092 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:55.054045 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mmtpr" event={"ID":"b3fc0734-b575-4b05-b44c-8457c8db77d5","Type":"ContainerDied","Data":"02654fa85d48020cfbed7c4fba3917a32a75497d767fe1dd17f0133148e1f238"} Apr 23 13:32:55.056296 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:55.056273 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" event={"ID":"e291a639-f923-4a04-9d39-8c585ab8e111","Type":"ContainerStarted","Data":"3567a4d21fdfc29d3d41d3a58d10e87b24f299387991fd8f6b41819fa0eb450e"} Apr 23 13:32:55.071649 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:55.071393 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-7vs52" podStartSLOduration=1.643424799 podStartE2EDuration="3.071377527s" podCreationTimestamp="2026-04-23 13:32:52 +0000 UTC" firstStartedPulling="2026-04-23 13:32:53.230874383 +0000 UTC m=+58.925779626" lastFinishedPulling="2026-04-23 13:32:54.658827107 +0000 UTC m=+60.353732354" observedRunningTime="2026-04-23 13:32:55.070803474 +0000 UTC m=+60.765708742" watchObservedRunningTime="2026-04-23 13:32:55.071377527 +0000 UTC m=+60.766282795" Apr 23 13:32:55.089676 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:55.089599 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8vwjk" podStartSLOduration=1.768558717 podStartE2EDuration="3.089584244s" podCreationTimestamp="2026-04-23 13:32:52 +0000 UTC" firstStartedPulling="2026-04-23 13:32:53.33519188 +0000 UTC m=+59.030097126" lastFinishedPulling="2026-04-23 13:32:54.656217394 +0000 UTC m=+60.351122653" observedRunningTime="2026-04-23 13:32:55.088446825 +0000 UTC m=+60.783352091" watchObservedRunningTime="2026-04-23 13:32:55.089584244 +0000 UTC m=+60.784489511" Apr 23 13:32:56.061188 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.061157 2562 generic.go:358] "Generic (PLEG): container finished" podID="4b9e094f-6f9f-4001-aadf-4b45daf3c0fa" containerID="9ffe513a55f5204627970505a922cfc655557e79a7a01c83cd655f2ac671f8ed" exitCode=0 Apr 23 13:32:56.061594 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.061246 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa","Type":"ContainerDied","Data":"9ffe513a55f5204627970505a922cfc655557e79a7a01c83cd655f2ac671f8ed"} Apr 23 13:32:56.063435 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.063402 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mmtpr" 
event={"ID":"b3fc0734-b575-4b05-b44c-8457c8db77d5","Type":"ContainerStarted","Data":"c2dae54c465cc6f85e5141fc88271c1846e90d2151b3c90d7f55d602058c938c"} Apr 23 13:32:56.063533 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.063435 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mmtpr" event={"ID":"b3fc0734-b575-4b05-b44c-8457c8db77d5","Type":"ContainerStarted","Data":"fbfbb33d89c9b37c1dc79deda812ea68f01fde40cd00015cc3eb52f9a36a877f"} Apr 23 13:32:56.115894 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.115832 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mmtpr" podStartSLOduration=3.20449822 podStartE2EDuration="4.115815502s" podCreationTimestamp="2026-04-23 13:32:52 +0000 UTC" firstStartedPulling="2026-04-23 13:32:53.142383304 +0000 UTC m=+58.837288556" lastFinishedPulling="2026-04-23 13:32:54.053700591 +0000 UTC m=+59.748605838" observedRunningTime="2026-04-23 13:32:56.115196677 +0000 UTC m=+61.810101944" watchObservedRunningTime="2026-04-23 13:32:56.115815502 +0000 UTC m=+61.810720767" Apr 23 13:32:56.657344 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.657310 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6f64b665bd-xtqts"] Apr 23 13:32:56.660545 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.660526 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.663595 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.663567 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 13:32:56.663595 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.663588 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-78blb\"" Apr 23 13:32:56.664804 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.664775 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-6m5r3m99cvtq2\"" Apr 23 13:32:56.664804 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.664799 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 13:32:56.664996 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.664811 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 13:32:56.665581 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.665566 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 13:32:56.683984 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.683959 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f64b665bd-xtqts"] Apr 23 13:32:56.775471 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.775432 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-client-ca-bundle\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " 
pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.775638 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.775505 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-metrics-server-audit-profiles\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.775638 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.775538 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-secret-metrics-server-tls\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.775638 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.775582 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-audit-log\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.775802 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.775668 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.775802 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.775729 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-secret-metrics-server-client-certs\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.775802 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.775752 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbls8\" (UniqueName: \"kubernetes.io/projected/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-kube-api-access-tbls8\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.876454 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.876346 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-secret-metrics-server-client-certs\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.876454 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.876397 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbls8\" (UniqueName: \"kubernetes.io/projected/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-kube-api-access-tbls8\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.876454 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.876435 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-client-ca-bundle\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.876892 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.876485 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-metrics-server-audit-profiles\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.876892 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.876515 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-secret-metrics-server-tls\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.876892 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.876555 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-audit-log\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.876892 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.876602 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " 
pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.878251 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.877215 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-audit-log\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.878251 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.877786 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.878714 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.878641 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-metrics-server-audit-profiles\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.883367 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.879368 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-client-ca-bundle\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.883367 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.883352 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-secret-metrics-server-tls\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.883541 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.883420 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-secret-metrics-server-client-certs\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.896363 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.896334 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbls8\" (UniqueName: \"kubernetes.io/projected/3f1cf3ec-73d0-491f-a5cd-95d17a7fac99-kube-api-access-tbls8\") pod \"metrics-server-6f64b665bd-xtqts\" (UID: \"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99\") " pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:56.971994 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:56.971960 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" Apr 23 13:32:57.397590 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.397558 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c45c69566-kkp2r"] Apr 23 13:32:57.418516 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.418478 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.420730 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.420688 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c45c69566-kkp2r"] Apr 23 13:32:57.421427 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.421390 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7ghqn\"" Apr 23 13:32:57.421427 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.421410 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 13:32:57.421710 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.421694 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 13:32:57.421771 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.421719 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 13:32:57.422426 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.422405 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 13:32:57.422895 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.422745 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 13:32:57.422895 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.422759 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 13:32:57.423147 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.423113 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 13:32:57.428277 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.427997 
2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 13:32:57.466537 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.466486 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f64b665bd-xtqts"] Apr 23 13:32:57.475579 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:32:57.475550 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f1cf3ec_73d0_491f_a5cd_95d17a7fac99.slice/crio-02f64dd0f32703fd029e54ba4a5eda2ca00b9e3d9ef4180ec8ff9252abdaa403 WatchSource:0}: Error finding container 02f64dd0f32703fd029e54ba4a5eda2ca00b9e3d9ef4180ec8ff9252abdaa403: Status 404 returned error can't find the container with id 02f64dd0f32703fd029e54ba4a5eda2ca00b9e3d9ef4180ec8ff9252abdaa403 Apr 23 13:32:57.482529 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.482505 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-trusted-ca-bundle\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.482619 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.482551 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-service-ca\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.482619 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.482581 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zcjg\" (UniqueName: 
\"kubernetes.io/projected/2565990d-8384-47d4-bc6e-26a4390db2cf-kube-api-access-7zcjg\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.482741 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.482624 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-oauth-config\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.482741 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.482697 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-serving-cert\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.482841 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.482742 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-console-config\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.482841 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.482762 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-oauth-serving-cert\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.583139 ip-10-0-141-22 
kubenswrapper[2562]: I0423 13:32:57.583112 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-console-config\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.583252 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.583144 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-oauth-serving-cert\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.583252 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.583175 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-trusted-ca-bundle\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.583252 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.583196 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-service-ca\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.583252 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.583215 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zcjg\" (UniqueName: \"kubernetes.io/projected/2565990d-8384-47d4-bc6e-26a4390db2cf-kube-api-access-7zcjg\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " 
pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.583252 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.583235 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-oauth-config\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.583506 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.583273 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-serving-cert\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.583903 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.583877 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-console-config\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.583903 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.583891 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-oauth-serving-cert\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.584062 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.583944 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-service-ca\") pod \"console-5c45c69566-kkp2r\" (UID: 
\"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.584216 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.584199 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-trusted-ca-bundle\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.585585 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.585565 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-oauth-config\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.585707 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.585691 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-serving-cert\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.593034 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.592986 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zcjg\" (UniqueName: \"kubernetes.io/projected/2565990d-8384-47d4-bc6e-26a4390db2cf-kube-api-access-7zcjg\") pod \"console-5c45c69566-kkp2r\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") " pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:57.730566 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:57.730523 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:32:58.028475 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.028411 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9b4c5" Apr 23 13:32:58.072219 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.072154 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" event={"ID":"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99","Type":"ContainerStarted","Data":"02f64dd0f32703fd029e54ba4a5eda2ca00b9e3d9ef4180ec8ff9252abdaa403"} Apr 23 13:32:58.189551 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.189364 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c45c69566-kkp2r"] Apr 23 13:32:58.193311 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:32:58.193270 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2565990d_8384_47d4_bc6e_26a4390db2cf.slice/crio-a9b00aaeb86959ddb89e033a7110052865b6cc2e38dcd94f7d86bca10d40e084 WatchSource:0}: Error finding container a9b00aaeb86959ddb89e033a7110052865b6cc2e38dcd94f7d86bca10d40e084: Status 404 returned error can't find the container with id a9b00aaeb86959ddb89e033a7110052865b6cc2e38dcd94f7d86bca10d40e084 Apr 23 13:32:58.388126 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.388096 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:32:58.410335 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.410305 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.415452 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.413627 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 13:32:58.415452 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.414069 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-flh8l\"" Apr 23 13:32:58.415452 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.414673 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:32:58.416613 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.416578 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 13:32:58.418670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.416864 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 13:32:58.418670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.417207 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 13:32:58.418670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.417417 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 13:32:58.418670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.417607 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 13:32:58.418670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.417846 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 13:32:58.418670 
ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.417854 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 13:32:58.418670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.418078 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 13:32:58.418670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.418119 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-at6ng3q6ul7sq\"" Apr 23 13:32:58.418670 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.418246 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 13:32:58.423547 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.421987 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 13:32:58.429073 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.428215 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 13:32:58.493084 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493008 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493226 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493099 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493226 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493132 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1da43892-0eae-4a1d-ade8-6a928a990187-config-out\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493226 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493158 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493226 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493204 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2d45\" (UniqueName: \"kubernetes.io/projected/1da43892-0eae-4a1d-ade8-6a928a990187-kube-api-access-w2d45\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493480 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493236 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493480 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493264 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493480 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493287 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-web-config\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493480 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493334 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-config\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493480 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493364 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493480 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493389 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1da43892-0eae-4a1d-ade8-6a928a990187-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
23 13:32:58.493480 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493447 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493823 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493488 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493823 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493517 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493823 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493562 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493823 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493589 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493823 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493627 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.493823 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.493657 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1da43892-0eae-4a1d-ade8-6a928a990187-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594224 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594202 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594413 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594239 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594413 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594258 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594413 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594301 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594413 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594325 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594413 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594354 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594413 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594380 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/1da43892-0eae-4a1d-ade8-6a928a990187-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594413 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594412 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594453 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594486 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1da43892-0eae-4a1d-ade8-6a928a990187-config-out\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594511 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594548 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2d45\" 
(UniqueName: \"kubernetes.io/projected/1da43892-0eae-4a1d-ade8-6a928a990187-kube-api-access-w2d45\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594580 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594609 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594631 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-web-config\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594680 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-config\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594714 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.594782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.594742 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1da43892-0eae-4a1d-ade8-6a928a990187-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.595923 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.595486 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.596697 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.596619 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.597850 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.597790 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1da43892-0eae-4a1d-ade8-6a928a990187-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.598882 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.598658 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.598882 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.598677 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.603538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.599936 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-web-config\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.603538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.600000 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.603538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.600835 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1da43892-0eae-4a1d-ade8-6a928a990187-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.603538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.601573 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.603538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.602446 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.603538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.602578 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.603538 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.603371 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1da43892-0eae-4a1d-ade8-6a928a990187-config-out\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.604404 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.604380 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1da43892-0eae-4a1d-ade8-6a928a990187-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.604855 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.604832 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.605314 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.605291 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-config\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.605718 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.605662 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.606891 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.606871 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1da43892-0eae-4a1d-ade8-6a928a990187-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.607903 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.607874 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2d45\" (UniqueName: \"kubernetes.io/projected/1da43892-0eae-4a1d-ade8-6a928a990187-kube-api-access-w2d45\") pod \"prometheus-k8s-0\" (UID: \"1da43892-0eae-4a1d-ade8-6a928a990187\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.759289 ip-10-0-141-22 kubenswrapper[2562]: 
I0423 13:32:58.759258 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:32:58.925392 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:58.925367 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:32:58.933100 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:32:58.930750 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1da43892_0eae_4a1d_ade8_6a928a990187.slice/crio-7c97dc6715e0c28e4447ff805a880227b2f642a10e8d09ed9f9d30edae5ac334 WatchSource:0}: Error finding container 7c97dc6715e0c28e4447ff805a880227b2f642a10e8d09ed9f9d30edae5ac334: Status 404 returned error can't find the container with id 7c97dc6715e0c28e4447ff805a880227b2f642a10e8d09ed9f9d30edae5ac334 Apr 23 13:32:59.078281 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:59.078220 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1da43892-0eae-4a1d-ade8-6a928a990187","Type":"ContainerStarted","Data":"8cb37a567f90609d4c0d941ca6d39569eedaff1fba7f0fb39f896d61fda8b2e4"} Apr 23 13:32:59.078281 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:59.078261 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1da43892-0eae-4a1d-ade8-6a928a990187","Type":"ContainerStarted","Data":"7c97dc6715e0c28e4447ff805a880227b2f642a10e8d09ed9f9d30edae5ac334"} Apr 23 13:32:59.082132 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:59.081888 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" event={"ID":"b8501b35-5e05-402b-b003-7a7a5e9a5a84","Type":"ContainerStarted","Data":"07b7399f90d55f6c123af2942bc85e70671830a6275aced01d5a292a67a7b703"} Apr 23 13:32:59.082132 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:59.081921 2562 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" event={"ID":"b8501b35-5e05-402b-b003-7a7a5e9a5a84","Type":"ContainerStarted","Data":"7c79133b13d3bf995598ab1576d8243051e1abc879bc725b8217005b1b4225ba"} Apr 23 13:32:59.082132 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:59.081934 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" event={"ID":"b8501b35-5e05-402b-b003-7a7a5e9a5a84","Type":"ContainerStarted","Data":"be4837a2cce3ffcf2006d88ea000605e30fd3a07526dfc3226215d2bdb9822e4"} Apr 23 13:32:59.088037 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:59.087922 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa","Type":"ContainerStarted","Data":"995bcf2b3ca6ad909f61a5ec174a8ea5b7ecf98db3320e5a78f4c18b380012b3"} Apr 23 13:32:59.088037 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:59.087953 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa","Type":"ContainerStarted","Data":"429f22909509dc4f1013ce759b17f9b69abe1e83d75893bb3b721c6c7f7b67f7"} Apr 23 13:32:59.088037 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:59.087966 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa","Type":"ContainerStarted","Data":"c4e75ecb8b419d67c4f0e3cecbd33c563cb388c8b20c73c429be22390322aaa8"} Apr 23 13:32:59.088037 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:59.087980 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa","Type":"ContainerStarted","Data":"74cd0b5f6b41700956a9115179add5210e1af77fa2c3c3c1a2367dd5e00f4a9c"} Apr 23 13:32:59.088037 ip-10-0-141-22 kubenswrapper[2562]: I0423 
13:32:59.087993 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa","Type":"ContainerStarted","Data":"066df4b50e573b24ad98610ed7aa7a80081f529acf4b3646e25b05c014b709c8"} Apr 23 13:32:59.089827 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:32:59.089793 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c45c69566-kkp2r" event={"ID":"2565990d-8384-47d4-bc6e-26a4390db2cf","Type":"ContainerStarted","Data":"a9b00aaeb86959ddb89e033a7110052865b6cc2e38dcd94f7d86bca10d40e084"} Apr 23 13:33:00.097065 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.097033 2562 generic.go:358] "Generic (PLEG): container finished" podID="1da43892-0eae-4a1d-ade8-6a928a990187" containerID="8cb37a567f90609d4c0d941ca6d39569eedaff1fba7f0fb39f896d61fda8b2e4" exitCode=0 Apr 23 13:33:00.097451 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.097084 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1da43892-0eae-4a1d-ade8-6a928a990187","Type":"ContainerDied","Data":"8cb37a567f90609d4c0d941ca6d39569eedaff1fba7f0fb39f896d61fda8b2e4"} Apr 23 13:33:00.516488 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.516451 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:33:00.516763 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.516525 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7rr\" (UniqueName: \"kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr\") pod \"network-check-target-xmjsn\" (UID: \"2a1fbae2-2df7-41eb-9ed9-aac09b5af692\") " 
pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:33:00.519483 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.519461 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 13:33:00.519589 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.519495 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 13:33:00.530042 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.530003 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8372a373-96b3-40a7-a175-86077c4b2030-metrics-certs\") pod \"network-metrics-daemon-kdj59\" (UID: \"8372a373-96b3-40a7-a175-86077c4b2030\") " pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:33:00.530125 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.530068 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 13:33:00.540680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.540662 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k7rr\" (UniqueName: \"kubernetes.io/projected/2a1fbae2-2df7-41eb-9ed9-aac09b5af692-kube-api-access-7k7rr\") pod \"network-check-target-xmjsn\" (UID: \"2a1fbae2-2df7-41eb-9ed9-aac09b5af692\") " pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:33:00.552315 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.552295 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c6f6d\"" Apr 23 13:33:00.559032 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.558997 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfgql\"" Apr 23 
13:33:00.560065 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.560047 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:33:00.567233 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:00.567213 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdj59" Apr 23 13:33:01.282465 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:01.282432 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kdj59"] Apr 23 13:33:01.287617 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:33:01.286527 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8372a373_96b3_40a7_a175_86077c4b2030.slice/crio-39cd98b80d570228edd64fee9ded2935758dd5e7168452bec9b187a2baab22fd WatchSource:0}: Error finding container 39cd98b80d570228edd64fee9ded2935758dd5e7168452bec9b187a2baab22fd: Status 404 returned error can't find the container with id 39cd98b80d570228edd64fee9ded2935758dd5e7168452bec9b187a2baab22fd Apr 23 13:33:01.308061 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:01.307999 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xmjsn"] Apr 23 13:33:01.311294 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:33:01.311268 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a1fbae2_2df7_41eb_9ed9_aac09b5af692.slice/crio-62648a31e9a54d09c535f1157a08d5042f827408617b9a566a099cdf4c00927f WatchSource:0}: Error finding container 62648a31e9a54d09c535f1157a08d5042f827408617b9a566a099cdf4c00927f: Status 404 returned error can't find the container with id 62648a31e9a54d09c535f1157a08d5042f827408617b9a566a099cdf4c00927f Apr 23 13:33:02.107034 ip-10-0-141-22 kubenswrapper[2562]: I0423 
13:33:02.106977 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xmjsn" event={"ID":"2a1fbae2-2df7-41eb-9ed9-aac09b5af692","Type":"ContainerStarted","Data":"62648a31e9a54d09c535f1157a08d5042f827408617b9a566a099cdf4c00927f"} Apr 23 13:33:02.110191 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.110156 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kdj59" event={"ID":"8372a373-96b3-40a7-a175-86077c4b2030","Type":"ContainerStarted","Data":"39cd98b80d570228edd64fee9ded2935758dd5e7168452bec9b187a2baab22fd"} Apr 23 13:33:02.112116 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.112089 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" event={"ID":"3f1cf3ec-73d0-491f-a5cd-95d17a7fac99","Type":"ContainerStarted","Data":"2f8406b7e1d723f0033bc26426f29e099c7810c247d96278878270f3ed452223"} Apr 23 13:33:02.115952 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.115903 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" event={"ID":"b8501b35-5e05-402b-b003-7a7a5e9a5a84","Type":"ContainerStarted","Data":"6941a46035ad25e41938d4096ccf134c94d141fd0a0271c6ae327b034316ec00"} Apr 23 13:33:02.115952 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.115935 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" event={"ID":"b8501b35-5e05-402b-b003-7a7a5e9a5a84","Type":"ContainerStarted","Data":"6ded3ca92b22050555ca29405256adc22fa40df396161db07cac42192afd6aba"} Apr 23 13:33:02.115952 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.115948 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" 
event={"ID":"b8501b35-5e05-402b-b003-7a7a5e9a5a84","Type":"ContainerStarted","Data":"6ed9f94abae5766e531bc71f9621433c6ab9b88b383af3fb53b152e22d12638f"} Apr 23 13:33:02.116712 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.116693 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:33:02.121586 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.121543 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4b9e094f-6f9f-4001-aadf-4b45daf3c0fa","Type":"ContainerStarted","Data":"7c2cbcad35df5a3239bdf0a5baa01d92fda13a7680cf71b5a733116477a16c5b"} Apr 23 13:33:02.124675 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.124646 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c45c69566-kkp2r" event={"ID":"2565990d-8384-47d4-bc6e-26a4390db2cf","Type":"ContainerStarted","Data":"e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91"} Apr 23 13:33:02.133230 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.133091 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts" podStartSLOduration=2.46944756 podStartE2EDuration="6.133074903s" podCreationTimestamp="2026-04-23 13:32:56 +0000 UTC" firstStartedPulling="2026-04-23 13:32:57.477800629 +0000 UTC m=+63.172705876" lastFinishedPulling="2026-04-23 13:33:01.14142797 +0000 UTC m=+66.836333219" observedRunningTime="2026-04-23 13:33:02.130109266 +0000 UTC m=+67.825014534" watchObservedRunningTime="2026-04-23 13:33:02.133074903 +0000 UTC m=+67.827980150" Apr 23 13:33:02.158950 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.157150 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.850006777 podStartE2EDuration="9.157134438s" podCreationTimestamp="2026-04-23 13:32:53 
+0000 UTC" firstStartedPulling="2026-04-23 13:32:54.839169363 +0000 UTC m=+60.534074607" lastFinishedPulling="2026-04-23 13:33:01.146297006 +0000 UTC m=+66.841202268" observedRunningTime="2026-04-23 13:33:02.157103247 +0000 UTC m=+67.852008527" watchObservedRunningTime="2026-04-23 13:33:02.157134438 +0000 UTC m=+67.852039704" Apr 23 13:33:02.181423 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.181366 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" podStartSLOduration=1.932054111 podStartE2EDuration="8.181347808s" podCreationTimestamp="2026-04-23 13:32:54 +0000 UTC" firstStartedPulling="2026-04-23 13:32:54.892147105 +0000 UTC m=+60.587052354" lastFinishedPulling="2026-04-23 13:33:01.141440795 +0000 UTC m=+66.836346051" observedRunningTime="2026-04-23 13:33:02.179587546 +0000 UTC m=+67.874492813" watchObservedRunningTime="2026-04-23 13:33:02.181347808 +0000 UTC m=+67.876253075" Apr 23 13:33:02.199618 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:02.199568 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c45c69566-kkp2r" podStartSLOduration=2.19210633 podStartE2EDuration="5.199552848s" podCreationTimestamp="2026-04-23 13:32:57 +0000 UTC" firstStartedPulling="2026-04-23 13:32:58.196397641 +0000 UTC m=+63.891302899" lastFinishedPulling="2026-04-23 13:33:01.20384415 +0000 UTC m=+66.898749417" observedRunningTime="2026-04-23 13:33:02.198971385 +0000 UTC m=+67.893876651" watchObservedRunningTime="2026-04-23 13:33:02.199552848 +0000 UTC m=+67.894458114" Apr 23 13:33:03.131505 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:03.131415 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kdj59" event={"ID":"8372a373-96b3-40a7-a175-86077c4b2030","Type":"ContainerStarted","Data":"0e043792444f9c1985fc10b43f0fa4739cd427c3e9d24ad1eb190c97314d58c9"} Apr 23 13:33:03.131505 ip-10-0-141-22 
kubenswrapper[2562]: I0423 13:33:03.131460 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kdj59" event={"ID":"8372a373-96b3-40a7-a175-86077c4b2030","Type":"ContainerStarted","Data":"90fb35c8c7cd4b3543f078d4688964d0b5c55ec1ccbcb012012cd7d292f56d02"} Apr 23 13:33:03.138543 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:03.138515 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-75d9456954-qwsqf" Apr 23 13:33:03.149942 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:03.149862 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kdj59" podStartSLOduration=67.876155179 podStartE2EDuration="1m9.149847189s" podCreationTimestamp="2026-04-23 13:31:54 +0000 UTC" firstStartedPulling="2026-04-23 13:33:01.291688186 +0000 UTC m=+66.986593430" lastFinishedPulling="2026-04-23 13:33:02.565380178 +0000 UTC m=+68.260285440" observedRunningTime="2026-04-23 13:33:03.147287472 +0000 UTC m=+68.842192749" watchObservedRunningTime="2026-04-23 13:33:03.149847189 +0000 UTC m=+68.844752755" Apr 23 13:33:05.143965 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:05.143871 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1da43892-0eae-4a1d-ade8-6a928a990187","Type":"ContainerStarted","Data":"aea439e9d96465347f34edbab0edef6f2436f835dab19a97d8fec07a3e7aecf3"} Apr 23 13:33:05.143965 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:05.143915 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1da43892-0eae-4a1d-ade8-6a928a990187","Type":"ContainerStarted","Data":"d7dba054f2aea8fcb40180dd00373ec3bfe23750fdd176b7c785b98c9e633fd2"} Apr 23 13:33:05.146715 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:05.146599 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xmjsn" event={"ID":"2a1fbae2-2df7-41eb-9ed9-aac09b5af692","Type":"ContainerStarted","Data":"e82e65a75ef4afb56e160fea9841161632ffd51a4e779e282ee840c54bc09ee8"} Apr 23 13:33:05.146715 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:05.146683 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xmjsn" Apr 23 13:33:05.162045 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:05.161814 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xmjsn" podStartSLOduration=67.487798456 podStartE2EDuration="1m11.161799442s" podCreationTimestamp="2026-04-23 13:31:54 +0000 UTC" firstStartedPulling="2026-04-23 13:33:01.31313508 +0000 UTC m=+67.008040323" lastFinishedPulling="2026-04-23 13:33:04.987136051 +0000 UTC m=+70.682041309" observedRunningTime="2026-04-23 13:33:05.161173598 +0000 UTC m=+70.856078868" watchObservedRunningTime="2026-04-23 13:33:05.161799442 +0000 UTC m=+70.856704708" Apr 23 13:33:06.151592 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:06.151560 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1da43892-0eae-4a1d-ade8-6a928a990187","Type":"ContainerStarted","Data":"5b7201edad12e57c6a6aa720d66cba338868f3c761e59af2329d96ba86b36d36"} Apr 23 13:33:06.151592 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:06.151595 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1da43892-0eae-4a1d-ade8-6a928a990187","Type":"ContainerStarted","Data":"75672a7ea63c253cb3c63064fcced6bf8579bde721b442bd53380dc3ddf4c53b"} Apr 23 13:33:06.152036 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:06.151604 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1da43892-0eae-4a1d-ade8-6a928a990187","Type":"ContainerStarted","Data":"a1aa85648742e3be91e7fb75d350480582034a37067b8457c7f043506046502d"} Apr 23 13:33:06.152036 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:06.151613 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1da43892-0eae-4a1d-ade8-6a928a990187","Type":"ContainerStarted","Data":"70261642eebb0280721d65be2c3803aaebf044b136a2662b7c8b9e6c5ba6c932"} Apr 23 13:33:06.190570 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:06.190516 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.30810307 podStartE2EDuration="8.190498752s" podCreationTimestamp="2026-04-23 13:32:58 +0000 UTC" firstStartedPulling="2026-04-23 13:33:00.100248017 +0000 UTC m=+65.795153263" lastFinishedPulling="2026-04-23 13:33:04.982643699 +0000 UTC m=+70.677548945" observedRunningTime="2026-04-23 13:33:06.182282508 +0000 UTC m=+71.877187785" watchObservedRunningTime="2026-04-23 13:33:06.190498752 +0000 UTC m=+71.885404009" Apr 23 13:33:07.731548 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:07.731517 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:33:07.731548 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:07.731553 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:33:07.736142 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:07.736120 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:33:08.161323 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:08.161297 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c45c69566-kkp2r" Apr 23 13:33:08.760103 ip-10-0-141-22 kubenswrapper[2562]: 
I0423 13:33:08.760068 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:33:16.972256 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:16.972217 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts"
Apr 23 13:33:16.972256 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:16.972259 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts"
Apr 23 13:33:36.977884 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:36.977854 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts"
Apr 23 13:33:36.981782 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:36.981755 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6f64b665bd-xtqts"
Apr 23 13:33:37.156558 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:37.156522 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xmjsn"
Apr 23 13:33:58.760174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:58.760121 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:33:58.779412 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:58.779378 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:33:59.323275 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:33:59.323242 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:16.504429 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.504392 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"]
Apr 23 13:34:16.508044 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.508006 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.512832 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.512804 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 23 13:34:16.512945 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.512868 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 23 13:34:16.513180 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.513163 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 23 13:34:16.514196 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.514180 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 23 13:34:16.514865 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.514845 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 23 13:34:16.515057 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.515041 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5hzgk\""
Apr 23 13:34:16.543432 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.543400 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 23 13:34:16.556195 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.556159 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"]
Apr 23 13:34:16.660469 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.660435 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d42a4682-f0df-43ed-8498-91f164416584-metrics-client-ca\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.660698 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.660496 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.660698 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.660588 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42a4682-f0df-43ed-8498-91f164416584-serving-certs-ca-bundle\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.660698 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.660628 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42a4682-f0df-43ed-8498-91f164416584-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.660698 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.660656 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-telemeter-client-tls\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.660865 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.660714 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-secret-telemeter-client\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.660865 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.660747 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-federate-client-tls\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.660865 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.660767 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2js6\" (UniqueName: \"kubernetes.io/projected/d42a4682-f0df-43ed-8498-91f164416584-kube-api-access-r2js6\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.761998 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.761875 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d42a4682-f0df-43ed-8498-91f164416584-metrics-client-ca\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.761998 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.761972 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.761998 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.762006 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42a4682-f0df-43ed-8498-91f164416584-serving-certs-ca-bundle\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.762328 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.762046 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42a4682-f0df-43ed-8498-91f164416584-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.762328 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.762064 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-telemeter-client-tls\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.762328 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.762093 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-secret-telemeter-client\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.762328 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.762115 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-federate-client-tls\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.762328 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.762131 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2js6\" (UniqueName: \"kubernetes.io/projected/d42a4682-f0df-43ed-8498-91f164416584-kube-api-access-r2js6\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.762722 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.762697 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d42a4682-f0df-43ed-8498-91f164416584-metrics-client-ca\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.762998 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.762970 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42a4682-f0df-43ed-8498-91f164416584-serving-certs-ca-bundle\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.763317 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.763289 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42a4682-f0df-43ed-8498-91f164416584-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.764599 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.764580 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-telemeter-client-tls\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.764802 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.764776 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.764868 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.764822 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-federate-client-tls\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.764868 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.764835 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d42a4682-f0df-43ed-8498-91f164416584-secret-telemeter-client\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.771812 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.771788 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2js6\" (UniqueName: \"kubernetes.io/projected/d42a4682-f0df-43ed-8498-91f164416584-kube-api-access-r2js6\") pod \"telemeter-client-59fbd9859d-xsx5t\" (UID: \"d42a4682-f0df-43ed-8498-91f164416584\") " pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.818038 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.817989 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"
Apr 23 13:34:16.951129 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:16.951094 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-59fbd9859d-xsx5t"]
Apr 23 13:34:16.953796 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:34:16.953762 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd42a4682_f0df_43ed_8498_91f164416584.slice/crio-bd832483e1f6bc890f8dd5b6ee5554259dac0fbb5d9fbb8c2057397f08510ac6 WatchSource:0}: Error finding container bd832483e1f6bc890f8dd5b6ee5554259dac0fbb5d9fbb8c2057397f08510ac6: Status 404 returned error can't find the container with id bd832483e1f6bc890f8dd5b6ee5554259dac0fbb5d9fbb8c2057397f08510ac6
Apr 23 13:34:17.362732 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:17.362698 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t" event={"ID":"d42a4682-f0df-43ed-8498-91f164416584","Type":"ContainerStarted","Data":"bd832483e1f6bc890f8dd5b6ee5554259dac0fbb5d9fbb8c2057397f08510ac6"}
Apr 23 13:34:19.370987 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:19.370902 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t" event={"ID":"d42a4682-f0df-43ed-8498-91f164416584","Type":"ContainerStarted","Data":"5da17094d1afca17de277c428319f6f85492b511a225df6c877bf885c9e476a2"}
Apr 23 13:34:19.370987 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:19.370945 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t" event={"ID":"d42a4682-f0df-43ed-8498-91f164416584","Type":"ContainerStarted","Data":"f5772d6de8b686f8f2a7d8791218fcd49484e57fc3c8a5e5cd81cf57922e22c3"}
Apr 23 13:34:19.370987 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:19.370955 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t" event={"ID":"d42a4682-f0df-43ed-8498-91f164416584","Type":"ContainerStarted","Data":"d9af80558c53c2e115da5d03bc32b93687b14aaea3ab597b8997f9496e3c82c0"}
Apr 23 13:34:19.398540 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:19.398483 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-59fbd9859d-xsx5t" podStartSLOduration=1.39582847 podStartE2EDuration="3.398466816s" podCreationTimestamp="2026-04-23 13:34:16 +0000 UTC" firstStartedPulling="2026-04-23 13:34:16.955592736 +0000 UTC m=+142.650497980" lastFinishedPulling="2026-04-23 13:34:18.958231079 +0000 UTC m=+144.653136326" observedRunningTime="2026-04-23 13:34:19.395885851 +0000 UTC m=+145.090791117" watchObservedRunningTime="2026-04-23 13:34:19.398466816 +0000 UTC m=+145.093372081"
Apr 23 13:34:38.422701 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:38.422604 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c45c69566-kkp2r"]
Apr 23 13:34:51.556872 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.556838 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7hkfb"]
Apr 23 13:34:51.559092 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.559072 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.561630 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.561612 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 13:34:51.569414 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.569391 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7hkfb"]
Apr 23 13:34:51.666198 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.666166 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/133e0188-47e1-4d65-ae44-5988ea3a3fe8-original-pull-secret\") pod \"global-pull-secret-syncer-7hkfb\" (UID: \"133e0188-47e1-4d65-ae44-5988ea3a3fe8\") " pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.666198 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.666202 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/133e0188-47e1-4d65-ae44-5988ea3a3fe8-kubelet-config\") pod \"global-pull-secret-syncer-7hkfb\" (UID: \"133e0188-47e1-4d65-ae44-5988ea3a3fe8\") " pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.666397 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.666241 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/133e0188-47e1-4d65-ae44-5988ea3a3fe8-dbus\") pod \"global-pull-secret-syncer-7hkfb\" (UID: \"133e0188-47e1-4d65-ae44-5988ea3a3fe8\") " pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.767575 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.767541 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/133e0188-47e1-4d65-ae44-5988ea3a3fe8-original-pull-secret\") pod \"global-pull-secret-syncer-7hkfb\" (UID: \"133e0188-47e1-4d65-ae44-5988ea3a3fe8\") " pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.767733 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.767579 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/133e0188-47e1-4d65-ae44-5988ea3a3fe8-kubelet-config\") pod \"global-pull-secret-syncer-7hkfb\" (UID: \"133e0188-47e1-4d65-ae44-5988ea3a3fe8\") " pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.767733 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.767618 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/133e0188-47e1-4d65-ae44-5988ea3a3fe8-dbus\") pod \"global-pull-secret-syncer-7hkfb\" (UID: \"133e0188-47e1-4d65-ae44-5988ea3a3fe8\") " pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.767733 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.767722 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/133e0188-47e1-4d65-ae44-5988ea3a3fe8-kubelet-config\") pod \"global-pull-secret-syncer-7hkfb\" (UID: \"133e0188-47e1-4d65-ae44-5988ea3a3fe8\") " pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.767877 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.767805 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/133e0188-47e1-4d65-ae44-5988ea3a3fe8-dbus\") pod \"global-pull-secret-syncer-7hkfb\" (UID: \"133e0188-47e1-4d65-ae44-5988ea3a3fe8\") " pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.769762 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.769744 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/133e0188-47e1-4d65-ae44-5988ea3a3fe8-original-pull-secret\") pod \"global-pull-secret-syncer-7hkfb\" (UID: \"133e0188-47e1-4d65-ae44-5988ea3a3fe8\") " pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.867752 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.867654 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hkfb"
Apr 23 13:34:51.980962 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:51.980917 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7hkfb"]
Apr 23 13:34:51.983247 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:34:51.983219 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod133e0188_47e1_4d65_ae44_5988ea3a3fe8.slice/crio-8de73e04826436f7a13b1f8306629b1da48392514de248f6453ea2291cc0869b WatchSource:0}: Error finding container 8de73e04826436f7a13b1f8306629b1da48392514de248f6453ea2291cc0869b: Status 404 returned error can't find the container with id 8de73e04826436f7a13b1f8306629b1da48392514de248f6453ea2291cc0869b
Apr 23 13:34:52.465884 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:52.465849 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7hkfb" event={"ID":"133e0188-47e1-4d65-ae44-5988ea3a3fe8","Type":"ContainerStarted","Data":"8de73e04826436f7a13b1f8306629b1da48392514de248f6453ea2291cc0869b"}
Apr 23 13:34:56.480549 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:56.480512 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7hkfb" event={"ID":"133e0188-47e1-4d65-ae44-5988ea3a3fe8","Type":"ContainerStarted","Data":"c02a7f61d9bb6ff81a8da079e5fb0c5e51017e5c75c29aa8adc49b5ee2b5a818"}
Apr 23 13:34:56.496172 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:34:56.496120 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7hkfb" podStartSLOduration=1.718059283 podStartE2EDuration="5.49610574s" podCreationTimestamp="2026-04-23 13:34:51 +0000 UTC" firstStartedPulling="2026-04-23 13:34:51.985141312 +0000 UTC m=+177.680046556" lastFinishedPulling="2026-04-23 13:34:55.763187767 +0000 UTC m=+181.458093013" observedRunningTime="2026-04-23 13:34:56.494929146 +0000 UTC m=+182.189834414" watchObservedRunningTime="2026-04-23 13:34:56.49610574 +0000 UTC m=+182.191010984"
Apr 23 13:35:03.442621 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.442543 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c45c69566-kkp2r" podUID="2565990d-8384-47d4-bc6e-26a4390db2cf" containerName="console" containerID="cri-o://e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91" gracePeriod=15
Apr 23 13:35:03.677091 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.677070 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c45c69566-kkp2r_2565990d-8384-47d4-bc6e-26a4390db2cf/console/0.log"
Apr 23 13:35:03.677201 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.677138 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c45c69566-kkp2r"
Apr 23 13:35:03.772570 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.772492 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zcjg\" (UniqueName: \"kubernetes.io/projected/2565990d-8384-47d4-bc6e-26a4390db2cf-kube-api-access-7zcjg\") pod \"2565990d-8384-47d4-bc6e-26a4390db2cf\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") "
Apr 23 13:35:03.772570 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.772555 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-oauth-config\") pod \"2565990d-8384-47d4-bc6e-26a4390db2cf\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") "
Apr 23 13:35:03.772757 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.772592 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-trusted-ca-bundle\") pod \"2565990d-8384-47d4-bc6e-26a4390db2cf\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") "
Apr 23 13:35:03.772757 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.772617 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-serving-cert\") pod \"2565990d-8384-47d4-bc6e-26a4390db2cf\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") "
Apr 23 13:35:03.772757 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.772666 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-console-config\") pod \"2565990d-8384-47d4-bc6e-26a4390db2cf\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") "
Apr 23 13:35:03.772757 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.772730 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-service-ca\") pod \"2565990d-8384-47d4-bc6e-26a4390db2cf\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") "
Apr 23 13:35:03.772962 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.772829 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-oauth-serving-cert\") pod \"2565990d-8384-47d4-bc6e-26a4390db2cf\" (UID: \"2565990d-8384-47d4-bc6e-26a4390db2cf\") "
Apr 23 13:35:03.773159 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.773098 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2565990d-8384-47d4-bc6e-26a4390db2cf" (UID: "2565990d-8384-47d4-bc6e-26a4390db2cf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:03.773408 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.773370 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-service-ca" (OuterVolumeSpecName: "service-ca") pod "2565990d-8384-47d4-bc6e-26a4390db2cf" (UID: "2565990d-8384-47d4-bc6e-26a4390db2cf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:03.773522 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.773428 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2565990d-8384-47d4-bc6e-26a4390db2cf" (UID: "2565990d-8384-47d4-bc6e-26a4390db2cf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:03.773522 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.773446 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-console-config" (OuterVolumeSpecName: "console-config") pod "2565990d-8384-47d4-bc6e-26a4390db2cf" (UID: "2565990d-8384-47d4-bc6e-26a4390db2cf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:03.774913 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.774882 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2565990d-8384-47d4-bc6e-26a4390db2cf" (UID: "2565990d-8384-47d4-bc6e-26a4390db2cf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:03.775188 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.775172 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2565990d-8384-47d4-bc6e-26a4390db2cf" (UID: "2565990d-8384-47d4-bc6e-26a4390db2cf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:03.775260 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.775171 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2565990d-8384-47d4-bc6e-26a4390db2cf-kube-api-access-7zcjg" (OuterVolumeSpecName: "kube-api-access-7zcjg") pod "2565990d-8384-47d4-bc6e-26a4390db2cf" (UID: "2565990d-8384-47d4-bc6e-26a4390db2cf"). InnerVolumeSpecName "kube-api-access-7zcjg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:35:03.873669 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.873637 2562 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-console-config\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:03.873669 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.873659 2562 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-service-ca\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:03.873669 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.873670 2562 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-oauth-serving-cert\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:03.873852 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.873679 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zcjg\" (UniqueName: \"kubernetes.io/projected/2565990d-8384-47d4-bc6e-26a4390db2cf-kube-api-access-7zcjg\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:03.873852 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.873688 2562 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-oauth-config\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:03.873852 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.873697 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2565990d-8384-47d4-bc6e-26a4390db2cf-trusted-ca-bundle\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:03.873852 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:03.873705 2562 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2565990d-8384-47d4-bc6e-26a4390db2cf-console-serving-cert\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:04.505487 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.505462 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c45c69566-kkp2r_2565990d-8384-47d4-bc6e-26a4390db2cf/console/0.log"
Apr 23 13:35:04.505887 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.505503 2562 generic.go:358] "Generic (PLEG): container finished" podID="2565990d-8384-47d4-bc6e-26a4390db2cf" containerID="e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91" exitCode=2
Apr 23 13:35:04.505887 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.505562 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c45c69566-kkp2r" event={"ID":"2565990d-8384-47d4-bc6e-26a4390db2cf","Type":"ContainerDied","Data":"e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91"}
Apr 23 13:35:04.505887 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.505589 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c45c69566-kkp2r" event={"ID":"2565990d-8384-47d4-bc6e-26a4390db2cf","Type":"ContainerDied","Data":"a9b00aaeb86959ddb89e033a7110052865b6cc2e38dcd94f7d86bca10d40e084"}
Apr 23 13:35:04.505887 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.505595 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c45c69566-kkp2r"
Apr 23 13:35:04.505887 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.505603 2562 scope.go:117] "RemoveContainer" containerID="e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91"
Apr 23 13:35:04.514706 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.514685 2562 scope.go:117] "RemoveContainer" containerID="e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91"
Apr 23 13:35:04.514955 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:35:04.514933 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91\": container with ID starting with e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91 not found: ID does not exist" containerID="e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91"
Apr 23 13:35:04.515098 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.514962 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91"} err="failed to get container status \"e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91\": rpc error: code = NotFound desc = could not find container \"e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91\": container with ID starting with e11ca3be70ffcbad42ed9c7a45dc07b66a95f5799c8f176df400aa2368157f91 not found: ID does not exist"
Apr 23 13:35:04.527084 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.527060 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c45c69566-kkp2r"]
Apr 23 13:35:04.529843 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.529822 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c45c69566-kkp2r"]
Apr 23 13:35:04.841588 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:35:04.841483 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2565990d-8384-47d4-bc6e-26a4390db2cf" path="/var/lib/kubelet/pods/2565990d-8384-47d4-bc6e-26a4390db2cf/volumes"
Apr 23 13:36:49.930914 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.930880 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-z8v6z"]
Apr 23 13:36:49.931344 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.931224 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2565990d-8384-47d4-bc6e-26a4390db2cf" containerName="console"
Apr 23 13:36:49.931344 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.931235 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="2565990d-8384-47d4-bc6e-26a4390db2cf" containerName="console"
Apr 23 13:36:49.931344 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.931292 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="2565990d-8384-47d4-bc6e-26a4390db2cf" containerName="console"
Apr 23 13:36:49.933038 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.933000 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z"
Apr 23 13:36:49.935686 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.935666 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-lm4rn\""
Apr 23 13:36:49.935807 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.935726 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 23 13:36:49.936976 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.936956 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 23 13:36:49.937088 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.936985 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 23 13:36:49.944639 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.944622 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-z8v6z"]
Apr 23 13:36:49.959502 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.959482 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84bdeda6-716d-427e-8e8d-6a537a2ef792-cert\") pod \"kserve-controller-manager-6b667fdd66-z8v6z\" (UID: \"84bdeda6-716d-427e-8e8d-6a537a2ef792\") " pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z"
Apr 23 13:36:49.959593 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:49.959510 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9mgz\" (UniqueName: \"kubernetes.io/projected/84bdeda6-716d-427e-8e8d-6a537a2ef792-kube-api-access-h9mgz\") pod \"kserve-controller-manager-6b667fdd66-z8v6z\" (UID: \"84bdeda6-716d-427e-8e8d-6a537a2ef792\") " pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z"
Apr 23
13:36:50.060240 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:50.060211 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84bdeda6-716d-427e-8e8d-6a537a2ef792-cert\") pod \"kserve-controller-manager-6b667fdd66-z8v6z\" (UID: \"84bdeda6-716d-427e-8e8d-6a537a2ef792\") " pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" Apr 23 13:36:50.060240 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:50.060243 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9mgz\" (UniqueName: \"kubernetes.io/projected/84bdeda6-716d-427e-8e8d-6a537a2ef792-kube-api-access-h9mgz\") pod \"kserve-controller-manager-6b667fdd66-z8v6z\" (UID: \"84bdeda6-716d-427e-8e8d-6a537a2ef792\") " pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" Apr 23 13:36:50.062531 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:50.062501 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84bdeda6-716d-427e-8e8d-6a537a2ef792-cert\") pod \"kserve-controller-manager-6b667fdd66-z8v6z\" (UID: \"84bdeda6-716d-427e-8e8d-6a537a2ef792\") " pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" Apr 23 13:36:50.069403 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:50.069382 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9mgz\" (UniqueName: \"kubernetes.io/projected/84bdeda6-716d-427e-8e8d-6a537a2ef792-kube-api-access-h9mgz\") pod \"kserve-controller-manager-6b667fdd66-z8v6z\" (UID: \"84bdeda6-716d-427e-8e8d-6a537a2ef792\") " pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" Apr 23 13:36:50.243484 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:50.243424 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" Apr 23 13:36:50.354414 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:50.354392 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-z8v6z"] Apr 23 13:36:50.356857 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:36:50.356830 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84bdeda6_716d_427e_8e8d_6a537a2ef792.slice/crio-534c107f0c344e9d3e2f2c94f890f234e366bd0e0f8a865c30c7ec8376892e92 WatchSource:0}: Error finding container 534c107f0c344e9d3e2f2c94f890f234e366bd0e0f8a865c30c7ec8376892e92: Status 404 returned error can't find the container with id 534c107f0c344e9d3e2f2c94f890f234e366bd0e0f8a865c30c7ec8376892e92 Apr 23 13:36:50.807954 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:50.807918 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" event={"ID":"84bdeda6-716d-427e-8e8d-6a537a2ef792","Type":"ContainerStarted","Data":"534c107f0c344e9d3e2f2c94f890f234e366bd0e0f8a865c30c7ec8376892e92"} Apr 23 13:36:53.819726 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:53.819692 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" event={"ID":"84bdeda6-716d-427e-8e8d-6a537a2ef792","Type":"ContainerStarted","Data":"a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a"} Apr 23 13:36:53.820077 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:53.819830 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" Apr 23 13:36:53.836258 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:53.836209 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" podStartSLOduration=2.224094808 
podStartE2EDuration="4.836193036s" podCreationTimestamp="2026-04-23 13:36:49 +0000 UTC" firstStartedPulling="2026-04-23 13:36:50.357993515 +0000 UTC m=+296.052898759" lastFinishedPulling="2026-04-23 13:36:52.97009174 +0000 UTC m=+298.664996987" observedRunningTime="2026-04-23 13:36:53.835475616 +0000 UTC m=+299.530380882" watchObservedRunningTime="2026-04-23 13:36:53.836193036 +0000 UTC m=+299.531098303" Apr 23 13:36:54.724064 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:54.724034 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 13:36:54.725934 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:54.725913 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 13:36:54.730557 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:36:54.730536 2562 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 13:37:24.827945 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:24.827914 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" Apr 23 13:37:25.406551 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.406520 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-z8v6z"] Apr 23 13:37:25.406800 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.406714 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" podUID="84bdeda6-716d-427e-8e8d-6a537a2ef792" containerName="manager" containerID="cri-o://a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a" gracePeriod=10 Apr 23 13:37:25.434288 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.434264 2562 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve/kserve-controller-manager-6b667fdd66-l94wm"] Apr 23 13:37:25.437394 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.437379 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" Apr 23 13:37:25.443919 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.443898 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-l94wm"] Apr 23 13:37:25.515259 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.515230 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nz8g\" (UniqueName: \"kubernetes.io/projected/7d61f14b-ffb8-4300-a5dd-1700bee88999-kube-api-access-8nz8g\") pod \"kserve-controller-manager-6b667fdd66-l94wm\" (UID: \"7d61f14b-ffb8-4300-a5dd-1700bee88999\") " pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" Apr 23 13:37:25.515377 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.515355 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d61f14b-ffb8-4300-a5dd-1700bee88999-cert\") pod \"kserve-controller-manager-6b667fdd66-l94wm\" (UID: \"7d61f14b-ffb8-4300-a5dd-1700bee88999\") " pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" Apr 23 13:37:25.616598 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.616564 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nz8g\" (UniqueName: \"kubernetes.io/projected/7d61f14b-ffb8-4300-a5dd-1700bee88999-kube-api-access-8nz8g\") pod \"kserve-controller-manager-6b667fdd66-l94wm\" (UID: \"7d61f14b-ffb8-4300-a5dd-1700bee88999\") " pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" Apr 23 13:37:25.616729 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.616690 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/7d61f14b-ffb8-4300-a5dd-1700bee88999-cert\") pod \"kserve-controller-manager-6b667fdd66-l94wm\" (UID: \"7d61f14b-ffb8-4300-a5dd-1700bee88999\") " pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" Apr 23 13:37:25.619151 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.619121 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d61f14b-ffb8-4300-a5dd-1700bee88999-cert\") pod \"kserve-controller-manager-6b667fdd66-l94wm\" (UID: \"7d61f14b-ffb8-4300-a5dd-1700bee88999\") " pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" Apr 23 13:37:25.625265 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.625237 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nz8g\" (UniqueName: \"kubernetes.io/projected/7d61f14b-ffb8-4300-a5dd-1700bee88999-kube-api-access-8nz8g\") pod \"kserve-controller-manager-6b667fdd66-l94wm\" (UID: \"7d61f14b-ffb8-4300-a5dd-1700bee88999\") " pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" Apr 23 13:37:25.646528 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.646512 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" Apr 23 13:37:25.717413 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.717390 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9mgz\" (UniqueName: \"kubernetes.io/projected/84bdeda6-716d-427e-8e8d-6a537a2ef792-kube-api-access-h9mgz\") pod \"84bdeda6-716d-427e-8e8d-6a537a2ef792\" (UID: \"84bdeda6-716d-427e-8e8d-6a537a2ef792\") " Apr 23 13:37:25.717534 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.717513 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84bdeda6-716d-427e-8e8d-6a537a2ef792-cert\") pod \"84bdeda6-716d-427e-8e8d-6a537a2ef792\" (UID: \"84bdeda6-716d-427e-8e8d-6a537a2ef792\") " Apr 23 13:37:25.719359 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.719337 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bdeda6-716d-427e-8e8d-6a537a2ef792-cert" (OuterVolumeSpecName: "cert") pod "84bdeda6-716d-427e-8e8d-6a537a2ef792" (UID: "84bdeda6-716d-427e-8e8d-6a537a2ef792"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:37:25.719454 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.719357 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84bdeda6-716d-427e-8e8d-6a537a2ef792-kube-api-access-h9mgz" (OuterVolumeSpecName: "kube-api-access-h9mgz") pod "84bdeda6-716d-427e-8e8d-6a537a2ef792" (UID: "84bdeda6-716d-427e-8e8d-6a537a2ef792"). InnerVolumeSpecName "kube-api-access-h9mgz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:37:25.780574 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.780551 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" Apr 23 13:37:25.819027 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.818996 2562 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84bdeda6-716d-427e-8e8d-6a537a2ef792-cert\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\"" Apr 23 13:37:25.819138 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.819036 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h9mgz\" (UniqueName: \"kubernetes.io/projected/84bdeda6-716d-427e-8e8d-6a537a2ef792-kube-api-access-h9mgz\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\"" Apr 23 13:37:25.893112 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.893088 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-l94wm"] Apr 23 13:37:25.895365 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:37:25.895343 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d61f14b_ffb8_4300_a5dd_1700bee88999.slice/crio-8463ac5c3bb61e2089ebf58be8d63bad4e89c039d8afb23f689c77cd03bb3006 WatchSource:0}: Error finding container 8463ac5c3bb61e2089ebf58be8d63bad4e89c039d8afb23f689c77cd03bb3006: Status 404 returned error can't find the container with id 8463ac5c3bb61e2089ebf58be8d63bad4e89c039d8afb23f689c77cd03bb3006 Apr 23 13:37:25.896583 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.896566 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:37:25.906821 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.906798 2562 generic.go:358] "Generic (PLEG): container finished" podID="84bdeda6-716d-427e-8e8d-6a537a2ef792" containerID="a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a" exitCode=0 Apr 23 13:37:25.906901 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.906864 2562 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" Apr 23 13:37:25.906901 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.906884 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" event={"ID":"84bdeda6-716d-427e-8e8d-6a537a2ef792","Type":"ContainerDied","Data":"a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a"} Apr 23 13:37:25.907032 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.906920 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-z8v6z" event={"ID":"84bdeda6-716d-427e-8e8d-6a537a2ef792","Type":"ContainerDied","Data":"534c107f0c344e9d3e2f2c94f890f234e366bd0e0f8a865c30c7ec8376892e92"} Apr 23 13:37:25.907032 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.906938 2562 scope.go:117] "RemoveContainer" containerID="a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a" Apr 23 13:37:25.907957 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.907936 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" event={"ID":"7d61f14b-ffb8-4300-a5dd-1700bee88999","Type":"ContainerStarted","Data":"8463ac5c3bb61e2089ebf58be8d63bad4e89c039d8afb23f689c77cd03bb3006"} Apr 23 13:37:25.914767 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.914748 2562 scope.go:117] "RemoveContainer" containerID="a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a" Apr 23 13:37:25.915005 ip-10-0-141-22 kubenswrapper[2562]: E0423 13:37:25.914984 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a\": container with ID starting with a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a not found: ID does not exist" 
containerID="a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a" Apr 23 13:37:25.915085 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.915027 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a"} err="failed to get container status \"a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a\": rpc error: code = NotFound desc = could not find container \"a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a\": container with ID starting with a52ea77e12bf16981881b44dadab1c17b88b290b540090a7990e8a2ef2f1e78a not found: ID does not exist" Apr 23 13:37:25.926865 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.926843 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-z8v6z"] Apr 23 13:37:25.929920 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:25.929902 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-z8v6z"] Apr 23 13:37:26.840963 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:26.840932 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84bdeda6-716d-427e-8e8d-6a537a2ef792" path="/var/lib/kubelet/pods/84bdeda6-716d-427e-8e8d-6a537a2ef792/volumes" Apr 23 13:37:26.913817 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:26.913786 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" event={"ID":"7d61f14b-ffb8-4300-a5dd-1700bee88999","Type":"ContainerStarted","Data":"bab59851908fe4a3c97e2bb35d251093da4fe5757207f86f8f76cc76fd4ef5e5"} Apr 23 13:37:26.914174 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:26.913933 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" Apr 23 13:37:26.930980 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:26.930911 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" podStartSLOduration=1.512055793 podStartE2EDuration="1.930898553s" podCreationTimestamp="2026-04-23 13:37:25 +0000 UTC" firstStartedPulling="2026-04-23 13:37:25.896684236 +0000 UTC m=+331.591589483" lastFinishedPulling="2026-04-23 13:37:26.315526982 +0000 UTC m=+332.010432243" observedRunningTime="2026-04-23 13:37:26.929424268 +0000 UTC m=+332.624329535" watchObservedRunningTime="2026-04-23 13:37:26.930898553 +0000 UTC m=+332.625803819" Apr 23 13:37:57.922355 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:57.922320 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6b667fdd66-l94wm" Apr 23 13:37:58.760548 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.760518 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-nl689"] Apr 23 13:37:58.760870 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.760857 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84bdeda6-716d-427e-8e8d-6a537a2ef792" containerName="manager" Apr 23 13:37:58.760919 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.760871 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bdeda6-716d-427e-8e8d-6a537a2ef792" containerName="manager" Apr 23 13:37:58.760953 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.760918 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="84bdeda6-716d-427e-8e8d-6a537a2ef792" containerName="manager" Apr 23 13:37:58.765234 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.765212 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nl689" Apr 23 13:37:58.767992 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.767968 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 23 13:37:58.768206 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.768043 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-vv9f8\"" Apr 23 13:37:58.773075 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.773054 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nl689"] Apr 23 13:37:58.776095 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.776074 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-rftpn"] Apr 23 13:37:58.779591 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.779571 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-rftpn" Apr 23 13:37:58.782071 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.782054 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 23 13:37:58.782168 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.782114 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-6pvm9\"" Apr 23 13:37:58.787549 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.787529 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-rftpn"] Apr 23 13:37:58.863624 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.863597 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk7jg\" (UniqueName: \"kubernetes.io/projected/ec2a8664-8ea2-4eab-b577-41922fe5d7cb-kube-api-access-pk7jg\") pod 
\"odh-model-controller-696fc77849-rftpn\" (UID: \"ec2a8664-8ea2-4eab-b577-41922fe5d7cb\") " pod="kserve/odh-model-controller-696fc77849-rftpn" Apr 23 13:37:58.863745 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.863641 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4llpt\" (UniqueName: \"kubernetes.io/projected/60788dfd-f0e4-46d2-b93a-e5195a0a2dcd-kube-api-access-4llpt\") pod \"model-serving-api-86f7b4b499-nl689\" (UID: \"60788dfd-f0e4-46d2-b93a-e5195a0a2dcd\") " pod="kserve/model-serving-api-86f7b4b499-nl689" Apr 23 13:37:58.863745 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.863658 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec2a8664-8ea2-4eab-b577-41922fe5d7cb-cert\") pod \"odh-model-controller-696fc77849-rftpn\" (UID: \"ec2a8664-8ea2-4eab-b577-41922fe5d7cb\") " pod="kserve/odh-model-controller-696fc77849-rftpn" Apr 23 13:37:58.863851 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.863766 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60788dfd-f0e4-46d2-b93a-e5195a0a2dcd-tls-certs\") pod \"model-serving-api-86f7b4b499-nl689\" (UID: \"60788dfd-f0e4-46d2-b93a-e5195a0a2dcd\") " pod="kserve/model-serving-api-86f7b4b499-nl689" Apr 23 13:37:58.964172 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.964146 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk7jg\" (UniqueName: \"kubernetes.io/projected/ec2a8664-8ea2-4eab-b577-41922fe5d7cb-kube-api-access-pk7jg\") pod \"odh-model-controller-696fc77849-rftpn\" (UID: \"ec2a8664-8ea2-4eab-b577-41922fe5d7cb\") " pod="kserve/odh-model-controller-696fc77849-rftpn" Apr 23 13:37:58.964507 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.964207 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4llpt\" (UniqueName: \"kubernetes.io/projected/60788dfd-f0e4-46d2-b93a-e5195a0a2dcd-kube-api-access-4llpt\") pod \"model-serving-api-86f7b4b499-nl689\" (UID: \"60788dfd-f0e4-46d2-b93a-e5195a0a2dcd\") " pod="kserve/model-serving-api-86f7b4b499-nl689" Apr 23 13:37:58.964507 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.964238 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec2a8664-8ea2-4eab-b577-41922fe5d7cb-cert\") pod \"odh-model-controller-696fc77849-rftpn\" (UID: \"ec2a8664-8ea2-4eab-b577-41922fe5d7cb\") " pod="kserve/odh-model-controller-696fc77849-rftpn" Apr 23 13:37:58.964507 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.964304 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60788dfd-f0e4-46d2-b93a-e5195a0a2dcd-tls-certs\") pod \"model-serving-api-86f7b4b499-nl689\" (UID: \"60788dfd-f0e4-46d2-b93a-e5195a0a2dcd\") " pod="kserve/model-serving-api-86f7b4b499-nl689" Apr 23 13:37:58.966686 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.966662 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60788dfd-f0e4-46d2-b93a-e5195a0a2dcd-tls-certs\") pod \"model-serving-api-86f7b4b499-nl689\" (UID: \"60788dfd-f0e4-46d2-b93a-e5195a0a2dcd\") " pod="kserve/model-serving-api-86f7b4b499-nl689" Apr 23 13:37:58.966759 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.966666 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec2a8664-8ea2-4eab-b577-41922fe5d7cb-cert\") pod \"odh-model-controller-696fc77849-rftpn\" (UID: \"ec2a8664-8ea2-4eab-b577-41922fe5d7cb\") " pod="kserve/odh-model-controller-696fc77849-rftpn" Apr 23 13:37:58.971978 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.971954 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk7jg\" (UniqueName: \"kubernetes.io/projected/ec2a8664-8ea2-4eab-b577-41922fe5d7cb-kube-api-access-pk7jg\") pod \"odh-model-controller-696fc77849-rftpn\" (UID: \"ec2a8664-8ea2-4eab-b577-41922fe5d7cb\") " pod="kserve/odh-model-controller-696fc77849-rftpn" Apr 23 13:37:58.972079 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:58.972038 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4llpt\" (UniqueName: \"kubernetes.io/projected/60788dfd-f0e4-46d2-b93a-e5195a0a2dcd-kube-api-access-4llpt\") pod \"model-serving-api-86f7b4b499-nl689\" (UID: \"60788dfd-f0e4-46d2-b93a-e5195a0a2dcd\") " pod="kserve/model-serving-api-86f7b4b499-nl689" Apr 23 13:37:59.077362 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:59.077309 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nl689" Apr 23 13:37:59.089050 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:59.089009 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-rftpn" Apr 23 13:37:59.209175 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:59.205506 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nl689"] Apr 23 13:37:59.215996 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:37:59.215967 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60788dfd_f0e4_46d2_b93a_e5195a0a2dcd.slice/crio-804ff7741151278c33f191e6aa206ca6f18bd73c8d603f71c106a172224edfcb WatchSource:0}: Error finding container 804ff7741151278c33f191e6aa206ca6f18bd73c8d603f71c106a172224edfcb: Status 404 returned error can't find the container with id 804ff7741151278c33f191e6aa206ca6f18bd73c8d603f71c106a172224edfcb Apr 23 13:37:59.230905 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:37:59.230886 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-rftpn"] Apr 23 13:37:59.232671 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:37:59.232644 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2a8664_8ea2_4eab_b577_41922fe5d7cb.slice/crio-535fc60fba9d722c798111f1bb4464596d2a8247e037d2d607b1fd0b0e5e5768 WatchSource:0}: Error finding container 535fc60fba9d722c798111f1bb4464596d2a8247e037d2d607b1fd0b0e5e5768: Status 404 returned error can't find the container with id 535fc60fba9d722c798111f1bb4464596d2a8247e037d2d607b1fd0b0e5e5768 Apr 23 13:38:00.011774 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:00.011714 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nl689" event={"ID":"60788dfd-f0e4-46d2-b93a-e5195a0a2dcd","Type":"ContainerStarted","Data":"804ff7741151278c33f191e6aa206ca6f18bd73c8d603f71c106a172224edfcb"} Apr 23 13:38:00.013324 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:00.013283 2562 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-rftpn" event={"ID":"ec2a8664-8ea2-4eab-b577-41922fe5d7cb","Type":"ContainerStarted","Data":"535fc60fba9d722c798111f1bb4464596d2a8247e037d2d607b1fd0b0e5e5768"} Apr 23 13:38:01.018210 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:01.018123 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nl689" event={"ID":"60788dfd-f0e4-46d2-b93a-e5195a0a2dcd","Type":"ContainerStarted","Data":"8a6842b75de2c0062f512e6d1a002e973f96f44ee627d8a711cb0da4ddf16e4b"} Apr 23 13:38:01.018625 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:01.018258 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-nl689" Apr 23 13:38:01.035223 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:01.035160 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-nl689" podStartSLOduration=1.551012836 podStartE2EDuration="3.03514256s" podCreationTimestamp="2026-04-23 13:37:58 +0000 UTC" firstStartedPulling="2026-04-23 13:37:59.217796587 +0000 UTC m=+364.912701836" lastFinishedPulling="2026-04-23 13:38:00.701926303 +0000 UTC m=+366.396831560" observedRunningTime="2026-04-23 13:38:01.034616047 +0000 UTC m=+366.729521315" watchObservedRunningTime="2026-04-23 13:38:01.03514256 +0000 UTC m=+366.730047828" Apr 23 13:38:03.025364 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:03.025327 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-rftpn" event={"ID":"ec2a8664-8ea2-4eab-b577-41922fe5d7cb","Type":"ContainerStarted","Data":"760b9ded0796bc320e1faa772a5056f1a194bfda58787a27bd328f5946a1575f"} Apr 23 13:38:03.025737 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:03.025541 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-rftpn" Apr 23 
13:38:03.046408 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:03.046362 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-rftpn" podStartSLOduration=2.219879767 podStartE2EDuration="5.046350267s" podCreationTimestamp="2026-04-23 13:37:58 +0000 UTC" firstStartedPulling="2026-04-23 13:37:59.233871862 +0000 UTC m=+364.928777106" lastFinishedPulling="2026-04-23 13:38:02.060342359 +0000 UTC m=+367.755247606" observedRunningTime="2026-04-23 13:38:03.04568455 +0000 UTC m=+368.740589815" watchObservedRunningTime="2026-04-23 13:38:03.046350267 +0000 UTC m=+368.741255582" Apr 23 13:38:12.025004 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:12.024974 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-nl689" Apr 23 13:38:14.030632 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:14.030606 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-rftpn" Apr 23 13:38:14.792641 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:14.792605 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-jqnhf"] Apr 23 13:38:14.795884 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:14.795868 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jqnhf" Apr 23 13:38:14.798579 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:14.798557 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 13:38:14.798685 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:14.798562 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4qx2w\"" Apr 23 13:38:14.802173 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:14.802152 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jqnhf"] Apr 23 13:38:14.892122 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:14.892091 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hhhq\" (UniqueName: \"kubernetes.io/projected/469766ec-56bd-4bba-9d49-8a0de338a0c9-kube-api-access-6hhhq\") pod \"s3-init-jqnhf\" (UID: \"469766ec-56bd-4bba-9d49-8a0de338a0c9\") " pod="kserve/s3-init-jqnhf" Apr 23 13:38:14.992627 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:14.992603 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hhhq\" (UniqueName: \"kubernetes.io/projected/469766ec-56bd-4bba-9d49-8a0de338a0c9-kube-api-access-6hhhq\") pod \"s3-init-jqnhf\" (UID: \"469766ec-56bd-4bba-9d49-8a0de338a0c9\") " pod="kserve/s3-init-jqnhf" Apr 23 13:38:15.001329 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:15.001300 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hhhq\" (UniqueName: \"kubernetes.io/projected/469766ec-56bd-4bba-9d49-8a0de338a0c9-kube-api-access-6hhhq\") pod \"s3-init-jqnhf\" (UID: \"469766ec-56bd-4bba-9d49-8a0de338a0c9\") " pod="kserve/s3-init-jqnhf" Apr 23 13:38:15.121712 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:15.121660 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jqnhf" Apr 23 13:38:15.236778 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:15.236748 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jqnhf"] Apr 23 13:38:15.239183 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:38:15.239155 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod469766ec_56bd_4bba_9d49_8a0de338a0c9.slice/crio-fd9d415227ff810cd9d239c8f088189184d4fe266663ff5f5e616d81b8c9431a WatchSource:0}: Error finding container fd9d415227ff810cd9d239c8f088189184d4fe266663ff5f5e616d81b8c9431a: Status 404 returned error can't find the container with id fd9d415227ff810cd9d239c8f088189184d4fe266663ff5f5e616d81b8c9431a Apr 23 13:38:16.064763 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:16.064722 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jqnhf" event={"ID":"469766ec-56bd-4bba-9d49-8a0de338a0c9","Type":"ContainerStarted","Data":"fd9d415227ff810cd9d239c8f088189184d4fe266663ff5f5e616d81b8c9431a"} Apr 23 13:38:20.079144 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:20.079063 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jqnhf" event={"ID":"469766ec-56bd-4bba-9d49-8a0de338a0c9","Type":"ContainerStarted","Data":"7bddac537b8a98638815fd64a5869087acc4113e4e83601f86a3ec9d4cdd87dd"} Apr 23 13:38:20.095993 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:20.095938 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-jqnhf" podStartSLOduration=1.605849807 podStartE2EDuration="6.095920929s" podCreationTimestamp="2026-04-23 13:38:14 +0000 UTC" firstStartedPulling="2026-04-23 13:38:15.241305628 +0000 UTC m=+380.936210872" lastFinishedPulling="2026-04-23 13:38:19.731376749 +0000 UTC m=+385.426281994" observedRunningTime="2026-04-23 13:38:20.093267372 +0000 UTC m=+385.788172659" watchObservedRunningTime="2026-04-23 13:38:20.095920929 
+0000 UTC m=+385.790826194" Apr 23 13:38:23.088709 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:23.088682 2562 generic.go:358] "Generic (PLEG): container finished" podID="469766ec-56bd-4bba-9d49-8a0de338a0c9" containerID="7bddac537b8a98638815fd64a5869087acc4113e4e83601f86a3ec9d4cdd87dd" exitCode=0 Apr 23 13:38:23.089099 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:23.088762 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jqnhf" event={"ID":"469766ec-56bd-4bba-9d49-8a0de338a0c9","Type":"ContainerDied","Data":"7bddac537b8a98638815fd64a5869087acc4113e4e83601f86a3ec9d4cdd87dd"} Apr 23 13:38:24.214390 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:24.214125 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jqnhf" Apr 23 13:38:24.271591 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:24.271558 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hhhq\" (UniqueName: \"kubernetes.io/projected/469766ec-56bd-4bba-9d49-8a0de338a0c9-kube-api-access-6hhhq\") pod \"469766ec-56bd-4bba-9d49-8a0de338a0c9\" (UID: \"469766ec-56bd-4bba-9d49-8a0de338a0c9\") " Apr 23 13:38:24.273568 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:24.273531 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469766ec-56bd-4bba-9d49-8a0de338a0c9-kube-api-access-6hhhq" (OuterVolumeSpecName: "kube-api-access-6hhhq") pod "469766ec-56bd-4bba-9d49-8a0de338a0c9" (UID: "469766ec-56bd-4bba-9d49-8a0de338a0c9"). InnerVolumeSpecName "kube-api-access-6hhhq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:38:24.372747 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:24.372673 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hhhq\" (UniqueName: \"kubernetes.io/projected/469766ec-56bd-4bba-9d49-8a0de338a0c9-kube-api-access-6hhhq\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\"" Apr 23 13:38:25.094842 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:25.094819 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jqnhf" Apr 23 13:38:25.094974 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:25.094838 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jqnhf" event={"ID":"469766ec-56bd-4bba-9d49-8a0de338a0c9","Type":"ContainerDied","Data":"fd9d415227ff810cd9d239c8f088189184d4fe266663ff5f5e616d81b8c9431a"} Apr 23 13:38:25.094974 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:38:25.094862 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd9d415227ff810cd9d239c8f088189184d4fe266663ff5f5e616d81b8c9431a" Apr 23 13:39:01.718781 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:01.718748 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-vjkrn"] Apr 23 13:39:01.719249 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:01.719091 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="469766ec-56bd-4bba-9d49-8a0de338a0c9" containerName="s3-init" Apr 23 13:39:01.719249 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:01.719104 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="469766ec-56bd-4bba-9d49-8a0de338a0c9" containerName="s3-init" Apr 23 13:39:01.719249 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:01.719166 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="469766ec-56bd-4bba-9d49-8a0de338a0c9" containerName="s3-init" Apr 23 13:39:01.752253 ip-10-0-141-22 kubenswrapper[2562]: I0423 
13:39:01.752226 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-vjkrn"] Apr 23 13:39:01.752371 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:01.752313 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-vjkrn" Apr 23 13:39:01.756612 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:01.756588 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4qx2w\"" Apr 23 13:39:01.756612 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:01.756601 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 23 13:39:01.860684 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:01.860658 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpj2g\" (UniqueName: \"kubernetes.io/projected/dc247972-f2e6-4eeb-aabd-68cd7adb55e6-kube-api-access-wpj2g\") pod \"s3-tls-init-custom-vjkrn\" (UID: \"dc247972-f2e6-4eeb-aabd-68cd7adb55e6\") " pod="kserve/s3-tls-init-custom-vjkrn" Apr 23 13:39:01.961883 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:01.961856 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpj2g\" (UniqueName: \"kubernetes.io/projected/dc247972-f2e6-4eeb-aabd-68cd7adb55e6-kube-api-access-wpj2g\") pod \"s3-tls-init-custom-vjkrn\" (UID: \"dc247972-f2e6-4eeb-aabd-68cd7adb55e6\") " pod="kserve/s3-tls-init-custom-vjkrn" Apr 23 13:39:01.978300 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:01.978238 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpj2g\" (UniqueName: \"kubernetes.io/projected/dc247972-f2e6-4eeb-aabd-68cd7adb55e6-kube-api-access-wpj2g\") pod \"s3-tls-init-custom-vjkrn\" (UID: \"dc247972-f2e6-4eeb-aabd-68cd7adb55e6\") " pod="kserve/s3-tls-init-custom-vjkrn" Apr 23 13:39:02.071052 ip-10-0-141-22 
kubenswrapper[2562]: I0423 13:39:02.071012 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-vjkrn" Apr 23 13:39:02.183447 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:02.183423 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-vjkrn"] Apr 23 13:39:02.186201 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:39:02.186169 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc247972_f2e6_4eeb_aabd_68cd7adb55e6.slice/crio-1b6f3ffd37322b67f5898dbd5cf00513d3de79ccc68f3917a7ec1afe7c0d7285 WatchSource:0}: Error finding container 1b6f3ffd37322b67f5898dbd5cf00513d3de79ccc68f3917a7ec1afe7c0d7285: Status 404 returned error can't find the container with id 1b6f3ffd37322b67f5898dbd5cf00513d3de79ccc68f3917a7ec1afe7c0d7285 Apr 23 13:39:02.202617 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:02.202590 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-vjkrn" event={"ID":"dc247972-f2e6-4eeb-aabd-68cd7adb55e6","Type":"ContainerStarted","Data":"1b6f3ffd37322b67f5898dbd5cf00513d3de79ccc68f3917a7ec1afe7c0d7285"} Apr 23 13:39:03.207464 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:03.207430 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-vjkrn" event={"ID":"dc247972-f2e6-4eeb-aabd-68cd7adb55e6","Type":"ContainerStarted","Data":"e32f81274849eb99e87c369b4e5cee597abe19621f8e2f604ad2f185d0175761"} Apr 23 13:39:03.226754 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:03.226708 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-vjkrn" podStartSLOduration=2.226693144 podStartE2EDuration="2.226693144s" podCreationTimestamp="2026-04-23 13:39:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 
13:39:03.224663174 +0000 UTC m=+428.919568439" watchObservedRunningTime="2026-04-23 13:39:03.226693144 +0000 UTC m=+428.921598410" Apr 23 13:39:08.224492 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:08.224455 2562 generic.go:358] "Generic (PLEG): container finished" podID="dc247972-f2e6-4eeb-aabd-68cd7adb55e6" containerID="e32f81274849eb99e87c369b4e5cee597abe19621f8e2f604ad2f185d0175761" exitCode=0 Apr 23 13:39:08.224896 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:08.224536 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-vjkrn" event={"ID":"dc247972-f2e6-4eeb-aabd-68cd7adb55e6","Type":"ContainerDied","Data":"e32f81274849eb99e87c369b4e5cee597abe19621f8e2f604ad2f185d0175761"} Apr 23 13:39:09.349488 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:09.349466 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-vjkrn" Apr 23 13:39:09.522695 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:09.522633 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpj2g\" (UniqueName: \"kubernetes.io/projected/dc247972-f2e6-4eeb-aabd-68cd7adb55e6-kube-api-access-wpj2g\") pod \"dc247972-f2e6-4eeb-aabd-68cd7adb55e6\" (UID: \"dc247972-f2e6-4eeb-aabd-68cd7adb55e6\") " Apr 23 13:39:09.524680 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:09.524657 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc247972-f2e6-4eeb-aabd-68cd7adb55e6-kube-api-access-wpj2g" (OuterVolumeSpecName: "kube-api-access-wpj2g") pod "dc247972-f2e6-4eeb-aabd-68cd7adb55e6" (UID: "dc247972-f2e6-4eeb-aabd-68cd7adb55e6"). InnerVolumeSpecName "kube-api-access-wpj2g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:39:09.623836 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:09.623811 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wpj2g\" (UniqueName: \"kubernetes.io/projected/dc247972-f2e6-4eeb-aabd-68cd7adb55e6-kube-api-access-wpj2g\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\"" Apr 23 13:39:10.234125 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:10.234089 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-vjkrn" event={"ID":"dc247972-f2e6-4eeb-aabd-68cd7adb55e6","Type":"ContainerDied","Data":"1b6f3ffd37322b67f5898dbd5cf00513d3de79ccc68f3917a7ec1afe7c0d7285"} Apr 23 13:39:10.234125 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:10.234122 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b6f3ffd37322b67f5898dbd5cf00513d3de79ccc68f3917a7ec1afe7c0d7285" Apr 23 13:39:10.234326 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:10.234133 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-vjkrn" Apr 23 13:39:12.639588 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.639550 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-xq5dk"] Apr 23 13:39:12.639972 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.639911 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc247972-f2e6-4eeb-aabd-68cd7adb55e6" containerName="s3-tls-init-custom" Apr 23 13:39:12.639972 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.639924 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc247972-f2e6-4eeb-aabd-68cd7adb55e6" containerName="s3-tls-init-custom" Apr 23 13:39:12.640076 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.639994 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc247972-f2e6-4eeb-aabd-68cd7adb55e6" containerName="s3-tls-init-custom" Apr 23 13:39:12.643133 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.643114 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-xq5dk" Apr 23 13:39:12.645744 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.645713 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 23 13:39:12.645868 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.645789 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4qx2w\"" Apr 23 13:39:12.648267 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.648234 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b8qm\" (UniqueName: \"kubernetes.io/projected/404476b4-5ccf-4e66-9373-8eb9ecb00511-kube-api-access-2b8qm\") pod \"s3-tls-init-serving-xq5dk\" (UID: \"404476b4-5ccf-4e66-9373-8eb9ecb00511\") " pod="kserve/s3-tls-init-serving-xq5dk" Apr 23 13:39:12.649384 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.649351 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-xq5dk"] Apr 23 13:39:12.748774 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.748752 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b8qm\" (UniqueName: \"kubernetes.io/projected/404476b4-5ccf-4e66-9373-8eb9ecb00511-kube-api-access-2b8qm\") pod \"s3-tls-init-serving-xq5dk\" (UID: \"404476b4-5ccf-4e66-9373-8eb9ecb00511\") " pod="kserve/s3-tls-init-serving-xq5dk" Apr 23 13:39:12.756856 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.756826 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b8qm\" (UniqueName: \"kubernetes.io/projected/404476b4-5ccf-4e66-9373-8eb9ecb00511-kube-api-access-2b8qm\") pod \"s3-tls-init-serving-xq5dk\" (UID: \"404476b4-5ccf-4e66-9373-8eb9ecb00511\") " pod="kserve/s3-tls-init-serving-xq5dk" Apr 23 13:39:12.966443 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:12.966411 2562 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-xq5dk" Apr 23 13:39:13.078997 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:13.078960 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-xq5dk"] Apr 23 13:39:13.081887 ip-10-0-141-22 kubenswrapper[2562]: W0423 13:39:13.081858 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod404476b4_5ccf_4e66_9373_8eb9ecb00511.slice/crio-a68ae22e87341d6ce448badba4cf5ff252b5b82945bf413a9985854460b0f89f WatchSource:0}: Error finding container a68ae22e87341d6ce448badba4cf5ff252b5b82945bf413a9985854460b0f89f: Status 404 returned error can't find the container with id a68ae22e87341d6ce448badba4cf5ff252b5b82945bf413a9985854460b0f89f Apr 23 13:39:13.244619 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:13.244547 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-xq5dk" event={"ID":"404476b4-5ccf-4e66-9373-8eb9ecb00511","Type":"ContainerStarted","Data":"5c4911e979c15cc6aba5410081ace03053e54fdb3e991fe0946105c0c297cee9"} Apr 23 13:39:13.244619 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:13.244586 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-xq5dk" event={"ID":"404476b4-5ccf-4e66-9373-8eb9ecb00511","Type":"ContainerStarted","Data":"a68ae22e87341d6ce448badba4cf5ff252b5b82945bf413a9985854460b0f89f"} Apr 23 13:39:13.262188 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:13.262141 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-xq5dk" podStartSLOduration=1.2621219909999999 podStartE2EDuration="1.262121991s" podCreationTimestamp="2026-04-23 13:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:39:13.25919723 +0000 UTC m=+438.954102496" 
watchObservedRunningTime="2026-04-23 13:39:13.262121991 +0000 UTC m=+438.957027260" Apr 23 13:39:17.257104 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:17.257075 2562 generic.go:358] "Generic (PLEG): container finished" podID="404476b4-5ccf-4e66-9373-8eb9ecb00511" containerID="5c4911e979c15cc6aba5410081ace03053e54fdb3e991fe0946105c0c297cee9" exitCode=0 Apr 23 13:39:17.257400 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:17.257123 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-xq5dk" event={"ID":"404476b4-5ccf-4e66-9373-8eb9ecb00511","Type":"ContainerDied","Data":"5c4911e979c15cc6aba5410081ace03053e54fdb3e991fe0946105c0c297cee9"} Apr 23 13:39:18.381345 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:18.381316 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-xq5dk" Apr 23 13:39:18.383637 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:18.383620 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b8qm\" (UniqueName: \"kubernetes.io/projected/404476b4-5ccf-4e66-9373-8eb9ecb00511-kube-api-access-2b8qm\") pod \"404476b4-5ccf-4e66-9373-8eb9ecb00511\" (UID: \"404476b4-5ccf-4e66-9373-8eb9ecb00511\") " Apr 23 13:39:18.385453 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:18.385425 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/404476b4-5ccf-4e66-9373-8eb9ecb00511-kube-api-access-2b8qm" (OuterVolumeSpecName: "kube-api-access-2b8qm") pod "404476b4-5ccf-4e66-9373-8eb9ecb00511" (UID: "404476b4-5ccf-4e66-9373-8eb9ecb00511"). InnerVolumeSpecName "kube-api-access-2b8qm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:39:18.484106 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:18.484080 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2b8qm\" (UniqueName: \"kubernetes.io/projected/404476b4-5ccf-4e66-9373-8eb9ecb00511-kube-api-access-2b8qm\") on node \"ip-10-0-141-22.ec2.internal\" DevicePath \"\"" Apr 23 13:39:19.264183 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:19.264153 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-xq5dk" Apr 23 13:39:19.264336 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:19.264177 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-xq5dk" event={"ID":"404476b4-5ccf-4e66-9373-8eb9ecb00511","Type":"ContainerDied","Data":"a68ae22e87341d6ce448badba4cf5ff252b5b82945bf413a9985854460b0f89f"} Apr 23 13:39:19.264336 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:39:19.264209 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a68ae22e87341d6ce448badba4cf5ff252b5b82945bf413a9985854460b0f89f" Apr 23 13:41:54.749391 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:41:54.749364 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 13:41:54.749990 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:41:54.749973 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 13:46:54.773597 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:46:54.773562 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 13:46:54.774160 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:46:54.773867 2562 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 13:51:54.798149 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:51:54.798120 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 13:51:54.798627 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:51:54.798485 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 13:56:54.821732 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:56:54.821704 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 13:56:54.822337 ip-10-0-141-22 kubenswrapper[2562]: I0423 13:56:54.822127 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:01:54.846123 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:01:54.846091 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:01:54.847128 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:01:54.847106 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:06:54.872071 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:06:54.872041 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:06:54.873255 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:06:54.873237 2562 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:11:54.895064 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:11:54.895037 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:11:54.897625 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:11:54.896652 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:16:54.922634 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:16:54.922604 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:16:54.925747 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:16:54.925722 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:21:54.946914 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:21:54.946817 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:21:54.952238 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:21:54.952219 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:26:54.969377 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:26:54.969275 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log" Apr 23 14:26:54.975272 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:26:54.975254 2562 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log"
Apr 23 14:31:54.992282 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:31:54.992180 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log"
Apr 23 14:31:54.998709 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:31:54.998688 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log"
Apr 23 14:36:45.313498 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.313461 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-66tjc/must-gather-vzvcw"]
Apr 23 14:36:45.314116 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.314042 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="404476b4-5ccf-4e66-9373-8eb9ecb00511" containerName="s3-tls-init-serving"
Apr 23 14:36:45.314116 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.314063 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="404476b4-5ccf-4e66-9373-8eb9ecb00511" containerName="s3-tls-init-serving"
Apr 23 14:36:45.314258 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.314170 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="404476b4-5ccf-4e66-9373-8eb9ecb00511" containerName="s3-tls-init-serving"
Apr 23 14:36:45.317363 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.317342 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-66tjc/must-gather-vzvcw"
Apr 23 14:36:45.320231 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.320210 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-66tjc\"/\"kube-root-ca.crt\""
Apr 23 14:36:45.320231 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.320220 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-66tjc\"/\"openshift-service-ca.crt\""
Apr 23 14:36:45.320380 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.320237 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-66tjc\"/\"default-dockercfg-6hdlv\""
Apr 23 14:36:45.323535 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.323515 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-66tjc/must-gather-vzvcw"]
Apr 23 14:36:45.371572 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.371540 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwmlp\" (UniqueName: \"kubernetes.io/projected/62420132-f213-46e7-b5e4-a1d27053e345-kube-api-access-jwmlp\") pod \"must-gather-vzvcw\" (UID: \"62420132-f213-46e7-b5e4-a1d27053e345\") " pod="openshift-must-gather-66tjc/must-gather-vzvcw"
Apr 23 14:36:45.371685 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.371577 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62420132-f213-46e7-b5e4-a1d27053e345-must-gather-output\") pod \"must-gather-vzvcw\" (UID: \"62420132-f213-46e7-b5e4-a1d27053e345\") " pod="openshift-must-gather-66tjc/must-gather-vzvcw"
Apr 23 14:36:45.472542 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.472511 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwmlp\" (UniqueName: \"kubernetes.io/projected/62420132-f213-46e7-b5e4-a1d27053e345-kube-api-access-jwmlp\") pod \"must-gather-vzvcw\" (UID: \"62420132-f213-46e7-b5e4-a1d27053e345\") " pod="openshift-must-gather-66tjc/must-gather-vzvcw"
Apr 23 14:36:45.472695 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.472545 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62420132-f213-46e7-b5e4-a1d27053e345-must-gather-output\") pod \"must-gather-vzvcw\" (UID: \"62420132-f213-46e7-b5e4-a1d27053e345\") " pod="openshift-must-gather-66tjc/must-gather-vzvcw"
Apr 23 14:36:45.472878 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.472858 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62420132-f213-46e7-b5e4-a1d27053e345-must-gather-output\") pod \"must-gather-vzvcw\" (UID: \"62420132-f213-46e7-b5e4-a1d27053e345\") " pod="openshift-must-gather-66tjc/must-gather-vzvcw"
Apr 23 14:36:45.481706 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.481679 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwmlp\" (UniqueName: \"kubernetes.io/projected/62420132-f213-46e7-b5e4-a1d27053e345-kube-api-access-jwmlp\") pod \"must-gather-vzvcw\" (UID: \"62420132-f213-46e7-b5e4-a1d27053e345\") " pod="openshift-must-gather-66tjc/must-gather-vzvcw"
Apr 23 14:36:45.640473 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.640355 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-66tjc/must-gather-vzvcw"
Apr 23 14:36:45.756725 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.756700 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-66tjc/must-gather-vzvcw"]
Apr 23 14:36:45.758914 ip-10-0-141-22 kubenswrapper[2562]: W0423 14:36:45.758886 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62420132_f213_46e7_b5e4_a1d27053e345.slice/crio-a4cc2512fbfb4fe437bac3b7530d5bddc2e091e467183b5f6d4fd4b12e6c940d WatchSource:0}: Error finding container a4cc2512fbfb4fe437bac3b7530d5bddc2e091e467183b5f6d4fd4b12e6c940d: Status 404 returned error can't find the container with id a4cc2512fbfb4fe437bac3b7530d5bddc2e091e467183b5f6d4fd4b12e6c940d
Apr 23 14:36:45.760789 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:45.760768 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 14:36:46.473062 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:46.473025 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-66tjc/must-gather-vzvcw" event={"ID":"62420132-f213-46e7-b5e4-a1d27053e345","Type":"ContainerStarted","Data":"a4cc2512fbfb4fe437bac3b7530d5bddc2e091e467183b5f6d4fd4b12e6c940d"}
Apr 23 14:36:47.479127 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:47.479089 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-66tjc/must-gather-vzvcw" event={"ID":"62420132-f213-46e7-b5e4-a1d27053e345","Type":"ContainerStarted","Data":"6a5980fe61e7db02047a6b5a92fae94041b1d3fd3db2e99d5ffafbc017de422a"}
Apr 23 14:36:47.479629 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:47.479606 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-66tjc/must-gather-vzvcw" event={"ID":"62420132-f213-46e7-b5e4-a1d27053e345","Type":"ContainerStarted","Data":"da9ece0d7451339f539334f0978cbc18158ff830dad0a7e969752d35d6efc1f1"}
Apr 23 14:36:47.496843 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:47.496797 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-66tjc/must-gather-vzvcw" podStartSLOduration=1.7518745359999999 podStartE2EDuration="2.496781932s" podCreationTimestamp="2026-04-23 14:36:45 +0000 UTC" firstStartedPulling="2026-04-23 14:36:45.760906312 +0000 UTC m=+3891.455811556" lastFinishedPulling="2026-04-23 14:36:46.505813705 +0000 UTC m=+3892.200718952" observedRunningTime="2026-04-23 14:36:47.495061083 +0000 UTC m=+3893.189966351" watchObservedRunningTime="2026-04-23 14:36:47.496781932 +0000 UTC m=+3893.191687197"
Apr 23 14:36:47.874900 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:47.874825 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7hkfb_133e0188-47e1-4d65-ae44-5988ea3a3fe8/global-pull-secret-syncer/0.log"
Apr 23 14:36:48.065346 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:48.065312 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hv7lz_d5b9a202-5bbf-447f-828a-8504cdc5749e/konnectivity-agent/0.log"
Apr 23 14:36:48.169382 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:48.169350 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-22.ec2.internal_dc1728671f1199487e42b58400f18934/haproxy/0.log"
Apr 23 14:36:51.597772 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.597677 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b9e094f-6f9f-4001-aadf-4b45daf3c0fa/alertmanager/0.log"
Apr 23 14:36:51.624720 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.624688 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b9e094f-6f9f-4001-aadf-4b45daf3c0fa/config-reloader/0.log"
Apr 23 14:36:51.655066 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.651482 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b9e094f-6f9f-4001-aadf-4b45daf3c0fa/kube-rbac-proxy-web/0.log"
Apr 23 14:36:51.688253 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.688221 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b9e094f-6f9f-4001-aadf-4b45daf3c0fa/kube-rbac-proxy/0.log"
Apr 23 14:36:51.712246 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.712215 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b9e094f-6f9f-4001-aadf-4b45daf3c0fa/kube-rbac-proxy-metric/0.log"
Apr 23 14:36:51.734792 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.734756 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b9e094f-6f9f-4001-aadf-4b45daf3c0fa/prom-label-proxy/0.log"
Apr 23 14:36:51.758295 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.758262 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4b9e094f-6f9f-4001-aadf-4b45daf3c0fa/init-config-reloader/0.log"
Apr 23 14:36:51.824085 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.824056 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7vs52_7f81a213-c585-418c-a4c7-d571e37e829d/kube-state-metrics/0.log"
Apr 23 14:36:51.858334 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.858261 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7vs52_7f81a213-c585-418c-a4c7-d571e37e829d/kube-rbac-proxy-main/0.log"
Apr 23 14:36:51.901999 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.901969 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7vs52_7f81a213-c585-418c-a4c7-d571e37e829d/kube-rbac-proxy-self/0.log"
Apr 23 14:36:51.935842 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:51.935816 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6f64b665bd-xtqts_3f1cf3ec-73d0-491f-a5cd-95d17a7fac99/metrics-server/0.log"
Apr 23 14:36:52.072563 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.072535 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mmtpr_b3fc0734-b575-4b05-b44c-8457c8db77d5/node-exporter/0.log"
Apr 23 14:36:52.094505 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.094476 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mmtpr_b3fc0734-b575-4b05-b44c-8457c8db77d5/kube-rbac-proxy/0.log"
Apr 23 14:36:52.115154 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.115089 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mmtpr_b3fc0734-b575-4b05-b44c-8457c8db77d5/init-textfile/0.log"
Apr 23 14:36:52.210864 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.210834 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8vwjk_e291a639-f923-4a04-9d39-8c585ab8e111/kube-rbac-proxy-main/0.log"
Apr 23 14:36:52.232058 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.232009 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8vwjk_e291a639-f923-4a04-9d39-8c585ab8e111/kube-rbac-proxy-self/0.log"
Apr 23 14:36:52.256043 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.255995 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8vwjk_e291a639-f923-4a04-9d39-8c585ab8e111/openshift-state-metrics/0.log"
Apr 23 14:36:52.311502 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.311471 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1da43892-0eae-4a1d-ade8-6a928a990187/prometheus/0.log"
Apr 23 14:36:52.327529 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.327501 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1da43892-0eae-4a1d-ade8-6a928a990187/config-reloader/0.log"
Apr 23 14:36:52.347468 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.347435 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1da43892-0eae-4a1d-ade8-6a928a990187/thanos-sidecar/0.log"
Apr 23 14:36:52.368396 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.368319 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1da43892-0eae-4a1d-ade8-6a928a990187/kube-rbac-proxy-web/0.log"
Apr 23 14:36:52.391035 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.390993 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1da43892-0eae-4a1d-ade8-6a928a990187/kube-rbac-proxy/0.log"
Apr 23 14:36:52.412318 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.412287 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1da43892-0eae-4a1d-ade8-6a928a990187/kube-rbac-proxy-thanos/0.log"
Apr 23 14:36:52.433346 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.433322 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1da43892-0eae-4a1d-ade8-6a928a990187/init-config-reloader/0.log"
Apr 23 14:36:52.463606 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.463572 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-5ntk5_b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb/prometheus-operator/0.log"
Apr 23 14:36:52.479218 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.479188 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-5ntk5_b81ccbe5-04d0-48c5-aa20-f3c2386ffeeb/kube-rbac-proxy/0.log"
Apr 23 14:36:52.531439 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.531411 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59fbd9859d-xsx5t_d42a4682-f0df-43ed-8498-91f164416584/telemeter-client/0.log"
Apr 23 14:36:52.553754 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.553715 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59fbd9859d-xsx5t_d42a4682-f0df-43ed-8498-91f164416584/reload/0.log"
Apr 23 14:36:52.576260 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.576213 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59fbd9859d-xsx5t_d42a4682-f0df-43ed-8498-91f164416584/kube-rbac-proxy/0.log"
Apr 23 14:36:52.603922 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.603894 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d9456954-qwsqf_b8501b35-5e05-402b-b003-7a7a5e9a5a84/thanos-query/0.log"
Apr 23 14:36:52.624661 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.624633 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d9456954-qwsqf_b8501b35-5e05-402b-b003-7a7a5e9a5a84/kube-rbac-proxy-web/0.log"
Apr 23 14:36:52.648242 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.648205 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d9456954-qwsqf_b8501b35-5e05-402b-b003-7a7a5e9a5a84/kube-rbac-proxy/0.log"
Apr 23 14:36:52.676907 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.676816 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d9456954-qwsqf_b8501b35-5e05-402b-b003-7a7a5e9a5a84/prom-label-proxy/0.log"
Apr 23 14:36:52.700118 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.700090 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d9456954-qwsqf_b8501b35-5e05-402b-b003-7a7a5e9a5a84/kube-rbac-proxy-rules/0.log"
Apr 23 14:36:52.722320 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:52.722291 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-75d9456954-qwsqf_b8501b35-5e05-402b-b003-7a7a5e9a5a84/kube-rbac-proxy-metrics/0.log"
Apr 23 14:36:54.946578 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:54.946533 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"]
Apr 23 14:36:54.953273 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:54.953246 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:54.955946 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:54.955917 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"]
Apr 23 14:36:55.022071 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.021931 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log"
Apr 23 14:36:55.032811 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.031238 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log"
Apr 23 14:36:55.058827 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.058801 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-lib-modules\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.058948 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.058848 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-proc\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.058948 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.058926 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-sys\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.059062 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.058952 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-podres\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.059062 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.058968 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfnll\" (UniqueName: \"kubernetes.io/projected/7d317961-2017-4663-b183-cc2894f30c16-kube-api-access-nfnll\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.160100 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.160059 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-proc\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.160326 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.160296 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-proc\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.160492 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.160468 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-sys\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.160611 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.160410 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-sys\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.160699 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.160641 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-podres\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.160699 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.160673 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfnll\" (UniqueName: \"kubernetes.io/projected/7d317961-2017-4663-b183-cc2894f30c16-kube-api-access-nfnll\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.160797 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.160739 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-podres\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.160797 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.160752 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-lib-modules\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.160867 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.160816 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7d317961-2017-4663-b183-cc2894f30c16-lib-modules\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.171803 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.171781 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfnll\" (UniqueName: \"kubernetes.io/projected/7d317961-2017-4663-b183-cc2894f30c16-kube-api-access-nfnll\") pod \"perf-node-gather-daemonset-h2tjt\" (UID: \"7d317961-2017-4663-b183-cc2894f30c16\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.266410 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.266320 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:55.600726 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.600639 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"]
Apr 23 14:36:55.604462 ip-10-0-141-22 kubenswrapper[2562]: W0423 14:36:55.604397 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7d317961_2017_4663_b183_cc2894f30c16.slice/crio-466776fa5b1d5c02b6a1a75699f0aa7dfc4fb1eb1083be830859928854bb26b4 WatchSource:0}: Error finding container 466776fa5b1d5c02b6a1a75699f0aa7dfc4fb1eb1083be830859928854bb26b4: Status 404 returned error can't find the container with id 466776fa5b1d5c02b6a1a75699f0aa7dfc4fb1eb1083be830859928854bb26b4
Apr 23 14:36:55.949153 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.949123 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9b4c5_3af26c35-b8fa-492e-89df-4d39fa887de9/dns/0.log"
Apr 23 14:36:55.967592 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:55.967570 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9b4c5_3af26c35-b8fa-492e-89df-4d39fa887de9/kube-rbac-proxy/0.log"
Apr 23 14:36:56.039789 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:56.039764 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4mlb8_eb69ff46-2dee-4fb6-ad10-74074668a10f/dns-node-resolver/0.log"
Apr 23 14:36:56.514103 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:56.514066 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt" event={"ID":"7d317961-2017-4663-b183-cc2894f30c16","Type":"ContainerStarted","Data":"0b274235e07354a5be1d15ddb44a884f182400d32b8ee5984e6d4891317c406b"}
Apr 23 14:36:56.514103 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:56.514111 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt" event={"ID":"7d317961-2017-4663-b183-cc2894f30c16","Type":"ContainerStarted","Data":"466776fa5b1d5c02b6a1a75699f0aa7dfc4fb1eb1083be830859928854bb26b4"}
Apr 23 14:36:56.514366 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:56.514148 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:36:56.521123 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:56.521098 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hc6xl_30d3d180-9aae-49e3-9c8b-f13ce3df5f68/node-ca/0.log"
Apr 23 14:36:56.531722 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:56.531681 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt" podStartSLOduration=2.531669783 podStartE2EDuration="2.531669783s" podCreationTimestamp="2026-04-23 14:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:36:56.53074908 +0000 UTC m=+3902.225654351" watchObservedRunningTime="2026-04-23 14:36:56.531669783 +0000 UTC m=+3902.226575049"
Apr 23 14:36:57.601240 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:57.601207 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ljjs8_0c46498b-5ca7-4562-8e0f-dc82cd5bb6ce/serve-healthcheck-canary/0.log"
Apr 23 14:36:58.042341 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:58.042310 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wb5pq_cb4ae027-c824-44ff-bf3c-22c33097e46b/kube-rbac-proxy/0.log"
Apr 23 14:36:58.062477 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:58.062455 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wb5pq_cb4ae027-c824-44ff-bf3c-22c33097e46b/exporter/0.log"
Apr 23 14:36:58.082600 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:36:58.082579 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wb5pq_cb4ae027-c824-44ff-bf3c-22c33097e46b/extractor/0.log"
Apr 23 14:37:00.213431 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:00.213396 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6b667fdd66-l94wm_7d61f14b-ffb8-4300-a5dd-1700bee88999/manager/0.log"
Apr 23 14:37:00.252970 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:00.252937 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-nl689_60788dfd-f0e4-46d2-b93a-e5195a0a2dcd/server/0.log"
Apr 23 14:37:00.686767 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:00.686723 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-rftpn_ec2a8664-8ea2-4eab-b577-41922fe5d7cb/manager/0.log"
Apr 23 14:37:00.707397 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:00.707363 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-jqnhf_469766ec-56bd-4bba-9d49-8a0de338a0c9/s3-init/0.log"
Apr 23 14:37:00.730636 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:00.730612 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-vjkrn_dc247972-f2e6-4eeb-aabd-68cd7adb55e6/s3-tls-init-custom/0.log"
Apr 23 14:37:00.751252 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:00.751226 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-xq5dk_404476b4-5ccf-4e66-9373-8eb9ecb00511/s3-tls-init-serving/0.log"
Apr 23 14:37:02.527339 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:02.527311 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-h2tjt"
Apr 23 14:37:06.277759 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:06.277734 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87dsw_db736963-96fd-4537-b82e-5a28f2543a84/kube-multus-additional-cni-plugins/0.log"
Apr 23 14:37:06.300355 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:06.300328 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87dsw_db736963-96fd-4537-b82e-5a28f2543a84/egress-router-binary-copy/0.log"
Apr 23 14:37:06.320552 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:06.320525 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87dsw_db736963-96fd-4537-b82e-5a28f2543a84/cni-plugins/0.log"
Apr 23 14:37:06.338931 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:06.338877 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87dsw_db736963-96fd-4537-b82e-5a28f2543a84/bond-cni-plugin/0.log"
Apr 23 14:37:06.358193 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:06.358168 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87dsw_db736963-96fd-4537-b82e-5a28f2543a84/routeoverride-cni/0.log"
Apr 23 14:37:06.378649 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:06.378627 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87dsw_db736963-96fd-4537-b82e-5a28f2543a84/whereabouts-cni-bincopy/0.log"
Apr 23 14:37:06.400837 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:06.400806 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87dsw_db736963-96fd-4537-b82e-5a28f2543a84/whereabouts-cni/0.log"
Apr 23 14:37:06.867967 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:06.867857 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjz54_3ab46110-c05e-4dde-9c0e-2a035e761a4a/kube-multus/0.log"
Apr 23 14:37:07.013941 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:07.013895 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kdj59_8372a373-96b3-40a7-a175-86077c4b2030/network-metrics-daemon/0.log"
Apr 23 14:37:07.034594 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:07.034571 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kdj59_8372a373-96b3-40a7-a175-86077c4b2030/kube-rbac-proxy/0.log"
Apr 23 14:37:07.751984 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:07.751949 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-controller/0.log"
Apr 23 14:37:07.767637 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:07.767608 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/0.log"
Apr 23 14:37:07.803439 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:07.803410 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovn-acl-logging/1.log"
Apr 23 14:37:07.820740 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:07.820720 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/kube-rbac-proxy-node/0.log"
Apr 23 14:37:07.843253 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:07.843234 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 14:37:07.861197 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:07.861174 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/northd/0.log"
Apr 23 14:37:07.880367 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:07.880346 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/nbdb/0.log"
Apr 23 14:37:07.901037 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:07.901001 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/sbdb/0.log"
Apr 23 14:37:08.087099 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:08.087034 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmczc_3d337b45-35c2-42c7-a28e-1498d3ec882d/ovnkube-controller/0.log"
Apr 23 14:37:09.700416 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:09.700386 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xmjsn_2a1fbae2-2df7-41eb-9ed9-aac09b5af692/network-check-target-container/0.log"
Apr 23 14:37:10.601499 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:10.601471 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-djgmc_ce8caef2-3b61-4f26-ab2b-c0770c8d1569/iptables-alerter/0.log"
Apr 23 14:37:11.299150 ip-10-0-141-22 kubenswrapper[2562]: I0423 14:37:11.299121 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-xgv6v_cbc3279c-8519-4d64-887e-5441f89c8b3d/tuned/0.log"