Apr 16 16:26:59.591508 ip-10-0-132-191 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 16:26:59.591518 ip-10-0-132-191 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 16:26:59.591524 ip-10-0-132-191 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 16:26:59.591760 ip-10-0-132-191 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 16:27:09.821716 ip-10-0-132-191 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 16:27:09.821738 ip-10-0-132-191 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 5899a4b6544c46358abd3b337453410b --
Apr 16 16:29:39.933769 ip-10-0-132-191 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:29:40.391530 ip-10-0-132-191 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:29:40.391530 ip-10-0-132-191 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:29:40.391530 ip-10-0-132-191 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:29:40.391530 ip-10-0-132-191 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:29:40.391530 ip-10-0-132-191 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:29:40.393302 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.393216 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:29:40.396222 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396206 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:29:40.396222 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396221 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396225 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396229 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396232 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396235 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396238 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396241 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396244 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396247 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396249 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396252 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396254 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396257 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396260 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396262 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396265 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396267 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396270 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396273 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396275 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:29:40.396286 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396278 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396281 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396284 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396290 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396293 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396296 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396299 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396302 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396304 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396307 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396309 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396312 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396314 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396317 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396319 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396322 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396324 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396327 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396330 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:29:40.396780 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396332 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396335 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396337 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396340 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396343 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396346 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396349 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396351 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396353 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396356 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396359 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396361 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396365 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396369 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396372 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396376 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396379 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396381 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396384 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396386 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:29:40.397273 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396389 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396391 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396394 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396396 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396399 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396402 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396404 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396407 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396411 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396414 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396417 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396420 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396424 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396427 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396446 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396450 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396454 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396458 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396463 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:29:40.397772 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396466 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396468 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396471 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396474 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396476 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396479 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396481 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396850 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396856 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396859 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396862 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396865 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396868 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396870 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396873 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396876 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396878 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396881 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396883 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396886 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:29:40.398225 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396889 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396891 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396894 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396897 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396899 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396902 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396905 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396908 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396910 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396913 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396915 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396918 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396921 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396923 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396926 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396928 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396931 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396934 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396936 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:29:40.398742 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396939 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396942 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396944 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396947 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396950 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396952 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396955 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396957 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396960 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396962 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396964 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396968 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396970 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396973 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396975 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396978 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396981 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396984 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396986 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:29:40.399267 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396989 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396992 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396995 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.396997 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397000 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397002 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397005 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397008 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397013 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397016 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397018 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397021 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397023 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397026 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397029 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397031 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397034 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397036 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397039 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397042 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:29:40.399741 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397044 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397047 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397049 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397052 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397054 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397057 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397059 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397063 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397065 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397068 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397070 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397074 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397078 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397082 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397086 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397161 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397168 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397174 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397179 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397183 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397187 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:29:40.400233 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397191 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397195 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397198 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397202 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397206 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397210 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397214 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397217 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397220 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397224 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397227 2569 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397230 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397233 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397236 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397239 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397242 2569 flags.go:64] FLAG: --config-dir=""
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397245 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397249 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397253 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397256 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397259 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397262 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397265 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397269 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:29:40.400758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397272 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397276 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397279 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397284 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397287 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397290 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397293 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397296 2569 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397302 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397306 2569 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397309 2569 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397312 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397315 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397319 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397322 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397326 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397329 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397332 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397334 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397337 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397340 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397343 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397347 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397350 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397352 2569 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:29:40.401368 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397356 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397359 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397362 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397366 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397369 2569 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397372 2569 flags.go:64] FLAG: --help="false"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397375 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397378 2569 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397381 2569 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397385 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397388 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397391 2569 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397394 2569 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397397 2569 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397400 2569 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397405 2569 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397407 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397411 2569 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397414 2569 flags.go:64] FLAG: --kube-reserved=""
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397416 2569 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397419 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397422 2569 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397425 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397441 2569 flags.go:64] FLAG: --lock-file=""
Apr 16 16:29:40.401993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397445 2569 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397448 2569 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397451 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397456 2569 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397459 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397462 2569 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397465 2569 flags.go:64] FLAG: --logging-format="text"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397468 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397472 2569 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397475 2569 flags.go:64] FLAG: --manifest-url=""
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397477 2569 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397482 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397485 2569 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397489 2569 flags.go:64] FLAG: --max-pods="110"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397493 2569 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397496 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397499 2569 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397502 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397505 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397509 2569 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397512 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397520 2569 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397524 2569 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397529 2569 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 16:29:40.402605 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397532 2569 flags.go:64] FLAG: --pod-cidr=""
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397535 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397541 2569 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397544 2569 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397547 2569 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397550 2569 flags.go:64] FLAG: --port="10250"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397553 2569 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397556 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e0bdcc48909bee49"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397560 2569 flags.go:64] FLAG: --qos-reserved=""
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397563 2569 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397566 2569 flags.go:64] FLAG: --register-node="true"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397569 2569 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397572 2569 flags.go:64] FLAG: --register-with-taints=""
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397575 2569 flags.go:64] FLAG: --registry-burst="10"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397578 2569 flags.go:64] FLAG: --registry-qps="5"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397581 2569 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397584 2569 flags.go:64] FLAG: --reserved-memory=""
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397588 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397590 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397593 2569 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397596 2569 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397599 2569 flags.go:64] FLAG: --runonce="false"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397602 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397607 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397610 2569 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 16:29:40.403226 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397613 2569 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397616 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397619 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397622 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397626 2569 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397629 2569 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397633 2569 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397636 2569 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397639 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397642 2569 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397645 2569 flags.go:64] FLAG: --system-cgroups=""
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397648 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397653 2569 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397656 2569 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397659 2569 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397662 2569 flags.go:64] FLAG: --tls-min-version=""
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397665 2569 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397668 2569 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397671 2569 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397673 2569 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397676 2569 flags.go:64] FLAG: --v="2"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397681 2569 flags.go:64] FLAG: --version="false"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397685 2569 flags.go:64] FLAG: --vmodule=""
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397694 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.397697 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 16:29:40.403869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397787 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397790 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397794 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397797 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397800 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397804 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397807 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397810 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397813 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397816 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397819 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397821 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397824 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397829 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397831 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397834 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397836 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397839 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397841 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397844 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:29:40.404518 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397846 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397849 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397852 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397855 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397857 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397860 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397862 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397865 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397868 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397870 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397874 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397878 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397881 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397884 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397887 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397890 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397892 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397897 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397899 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:29:40.405010 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397902 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397905 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397908 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397911 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397913 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397916 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397920 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397922 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397925 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397927 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397930 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397932 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397935 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397937 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397940 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397942 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397945 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397947 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397950 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397953 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:29:40.405502 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397955 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397958 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397961 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397963 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397966 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397968 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397972 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397975 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397978 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397980 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397984 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397987 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397989 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397992 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397994 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397997 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.397999 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.398002 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.398006 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.398009 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:29:40.405998 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.398011 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.398014 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.398017 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.398019 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.398022 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.398024 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.398027 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.398032 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.404326 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.404344 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404395 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404401 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404405 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404408 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404411 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404414 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:29:40.406508 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404417 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404419 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404422 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404425 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404428 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404446 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404449 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404452 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404455 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404457 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404460 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404463 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404466 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:29:40.406916 
ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404469 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404472 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404474 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404477 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404480 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404482 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404486 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:29:40.406916 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404490 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404494 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404496 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404499 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404502 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404505 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404508 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404511 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404514 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404517 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404520 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404522 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404525 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404528 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404531 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404535 2569 
feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404540 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404543 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404546 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:29:40.407396 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404549 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404551 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404554 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404557 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404560 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404562 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404565 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404568 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404570 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404573 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404575 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404578 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404581 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404583 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404586 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404589 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404591 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404594 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404597 2569 feature_gate.go:328] unrecognized feature gate: 
NetworkDiagnosticsConfig Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404600 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:29:40.407909 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404603 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404605 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404608 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404611 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404613 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404616 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404618 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404621 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404623 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404626 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404629 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404631 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404634 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404636 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404639 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404641 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404644 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404646 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404663 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:29:40.408390 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404666 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404669 2569 feature_gate.go:328] unrecognized feature gate: 
CPMSMachineNamePrefix Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.404675 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404777 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404782 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404785 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404789 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404792 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404795 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404798 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404801 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404803 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404806 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404809 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404812 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404815 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:29:40.408959 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404817 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404820 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404823 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404825 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404828 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404830 2569 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNS Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404833 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404835 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404838 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404840 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404843 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404846 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404848 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404851 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404853 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404857 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404860 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404863 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404865 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404868 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:29:40.409358 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404870 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404875 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
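
The long runs of "unrecognized feature gate" warnings here are expected on an OpenShift node: the cluster-scoped FeatureGate configuration carries many OpenShift-only gates (UpgradeStatus, PinnedImages, GatewayAPIController, and so on) that the kubelet binary does not implement, so it warns and skips them; only the gates it does recognize survive into the "feature gates: {map[...]}" summary lines. A minimal sketch for confirming this from a workstation, assuming cluster-admin access and the standard OpenShift FeatureGate API (jq is used only for pretty-printing):

  # Cluster-level gate list, including the names the kubelet warns about:
  oc get featuregate cluster -o yaml

  # Effective kubelet configuration (and its recognized gates) on the node
  # from this log, read via the node proxy's configz endpoint:
  oc get --raw "/api/v1/nodes/ip-10-0-132-191.ec2.internal/proxy/configz" | jq .
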
Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404879 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404882 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404885 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404887 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404890 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404893 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404895 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404897 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404900 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404903 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404906 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404908 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404910 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404913 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404915 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404918 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404921 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:29:40.409869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404923 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404925 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404928 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404930 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404933 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:29:40.410330 ip-10-0-132-191 
kubenswrapper[2569]: W0416 16:29:40.404936 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404938 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404941 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404943 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404946 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404949 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404952 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404954 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404957 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404959 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404962 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404965 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404967 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404970 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404972 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:29:40.410330 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404975 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404978 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404981 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404983 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404986 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404989 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404991 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404995 2569 feature_gate.go:349] Setting 
deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.404998 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.405001 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.405004 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.405006 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.405009 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:40.405011 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.405016 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:29:40.410829 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.406097 2569 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 16:29:40.411242 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.410826 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 16:29:40.412092 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.412079 2569 server.go:1019] "Starting client certificate rotation" Apr 16 16:29:40.412197 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.412179 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:29:40.412229 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.412220 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:29:40.437849 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.437829 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:29:40.442422 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.442395 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:29:40.459551 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.459532 2569 log.go:25] "Validated CRI v1 runtime API" Apr 16 16:29:40.464390 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.464376 2569 log.go:25] "Validated CRI v1 image API" Apr 16 16:29:40.465620 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.465604 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 16:29:40.469558 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.469536 2569 fs.go:135] Filesystem UUIDs: 
map[3a901988-77a2-4698-9c1a-f08fe3c43f73:/dev/nvme0n1p3 7709692e-76dc-49db-ad64-5f84446ec22c:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 16 16:29:40.469601 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.469560 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 16:29:40.470155 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.470139 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:29:40.475257 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.475141 2569 manager.go:217] Machine: {Timestamp:2026-04-16 16:29:40.473255675 +0000 UTC m=+0.417765660 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098168 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2dacc36ce8675e79117d014b3966c7 SystemUUID:ec2dacc3-6ce8-675e-7911-7d014b3966c7 BootID:5899a4b6-544c-4635-8abd-3b337453410b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:10:e3:d5:fc:27 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:10:e3:d5:fc:27 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:00:4a:d4:cc:82 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 
16:29:40.475257 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.475251 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 16:29:40.475367 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.475334 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 16:29:40.475723 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.475698 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 16:29:40.475860 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.475725 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-191.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 16:29:40.475910 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.475865 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 16:29:40.475910 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.475874 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 16:29:40.475910 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.475887 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:29:40.476747 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.476736 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:29:40.478058 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.478048 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:29:40.478172 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.478163 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 16:29:40.480442 ip-10-0-132-191 kubenswrapper[2569]: I0416 
16:29:40.480423 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 16 16:29:40.480475 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.480447 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 16:29:40.480475 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.480459 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 16:29:40.480475 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.480468 2569 kubelet.go:397] "Adding apiserver pod source" Apr 16 16:29:40.480579 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.480477 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 16:29:40.481614 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.481603 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:29:40.481656 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.481621 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:29:40.485222 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.485196 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 16:29:40.487175 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.487157 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:29:40.488524 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488502 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:29:40.488643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488530 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:29:40.488643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488547 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:29:40.488643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488557 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:29:40.488643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488566 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:29:40.488643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488577 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:29:40.488643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488587 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 16:29:40.488643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488597 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:29:40.488643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488608 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:29:40.488643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488626 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:29:40.488643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488650 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 16:29:40.489046 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488660 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:29:40.489046 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.488974 2569 
csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zh88x" Apr 16 16:29:40.489519 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.489509 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:29:40.489553 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.489525 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:29:40.491406 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.491385 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 16:29:40.491406 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.491393 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-191.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 16:29:40.492870 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.492855 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-191.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 16:29:40.493136 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.493123 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 16:29:40.493172 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.493164 2569 server.go:1295] "Started kubelet" Apr 16 16:29:40.493244 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.493220 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 16:29:40.493339 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.493298 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:29:40.493393 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.493365 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:29:40.494014 ip-10-0-132-191 systemd[1]: Started Kubernetes Kubelet. 
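
The "system:anonymous" watch failures just above are a normal, transient part of TLS bootstrap: until the client certificate requested through csr-zh88x is issued (which happens a few entries below), the kubelet has no API identity, so its informer list/watch calls are rejected. If bootstrap ever stalls before "Certificate signing request is issued", the CSR can be inspected and, for a CSR stuck Pending, approved manually; a short sketch using the CSR name from this log:

  # List bootstrap CSRs and their approval state:
  oc get csr
  oc describe csr csr-zh88x

  # Approval is normally automatic; manual approval is a fallback only:
  oc adm certificate approve csr-zh88x
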
Apr 16 16:29:40.494541 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.494408 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 16:29:40.494608 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.494531 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zh88x"
Apr 16 16:29:40.495993 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.495977 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 16:29:40.501137 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.501111 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 16:29:40.501637 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.501610 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 16:29:40.502780 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.502759 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 16:29:40.502856 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.502831 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 16:29:40.502856 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.502841 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 16:29:40.502960 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.502942 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:40.503027 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.502981 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 16:29:40.503027 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.502989 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 16:29:40.503684 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.503665 2569 factory.go:55] Registering systemd factory
Apr 16 16:29:40.503789 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.503739 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 16 16:29:40.504030 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.504002 2569 factory.go:153] Registering CRI-O factory
Apr 16 16:29:40.504030 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.504017 2569 factory.go:223] Registration of the crio container factory successfully
Apr 16 16:29:40.504171 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.504077 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 16:29:40.504171 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.504101 2569 factory.go:103] Registering Raw factory
Apr 16 16:29:40.504171 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.504118 2569 manager.go:1196] Started watching for new ooms in manager
Apr 16 16:29:40.504555 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.504527 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 16:29:40.504618 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.504591 2569 manager.go:319] Starting recovery of all containers
Apr 16 16:29:40.509914 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.509895 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:40.512285 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.512140 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-191.ec2.internal\" not found" node="ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.515243 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.515222 2569 manager.go:324] Recovery completed
Apr 16 16:29:40.519559 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.519546 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:40.521662 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.521647 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:40.521721 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.521673 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:40.521721 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.521687 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:40.522136 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.522121 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 16:29:40.522136 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.522133 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 16:29:40.522260 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.522151 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:29:40.524348 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.524334 2569 policy_none.go:49] "None policy: Start"
Apr 16 16:29:40.524348 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.524351 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 16:29:40.524456 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.524361 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 16:29:40.571052 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.571035 2569 manager.go:341] "Starting Device Plugin manager"
Apr 16 16:29:40.574696 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.571084 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 16:29:40.574696 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.571099 2569 server.go:85] "Starting device plugin registration server"
Apr 16 16:29:40.574696 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.571406 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 16:29:40.574696 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.571417 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 16:29:40.574696 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.571539 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 16:29:40.574696 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.571627 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 16:29:40.574696 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.571637 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 16:29:40.574696 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.572147 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 16:29:40.574696 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.572182 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:40.598100 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.598065 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 16:29:40.599230 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.599213 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 16:29:40.599319 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.599253 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 16:29:40.599319 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.599277 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 16:29:40.599319 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.599286 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 16:29:40.599463 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.599327 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 16:29:40.603051 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.603032 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:40.671849 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.671763 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:40.672780 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.672763 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:40.672890 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.672798 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:40.672890 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.672813 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:40.672890 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.672843 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.681682 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.681647 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.681682 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.681683 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-191.ec2.internal\": node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:40.698543 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.698520 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:40.699499 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.699474 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal"]
Apr 16 16:29:40.699598 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.699537 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:40.700314 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.700297 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:40.700378 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.700326 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:40.700378 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.700341 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:40.701882 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.701869 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:40.702017 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.702005 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.702056 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.702030 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:40.702533 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.702515 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:40.702628 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.702515 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:40.702628 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.702584 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:40.702628 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.702613 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:40.702628 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.702544 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:40.702812 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.702634 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:40.704601 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.704586 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.704685 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.704610 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:40.705202 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.705189 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:40.705284 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.705218 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:40.705284 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.705233 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:40.726276 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.726249 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-191.ec2.internal\" not found" node="ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.730733 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.730717 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-191.ec2.internal\" not found" node="ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.799035 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.799002 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:40.804365 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.804347 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ef362994efc56c5a4f05cd0c7122fff2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal\" (UID: \"ef362994efc56c5a4f05cd0c7122fff2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.804447 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.804373 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef362994efc56c5a4f05cd0c7122fff2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal\" (UID: \"ef362994efc56c5a4f05cd0c7122fff2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.804447 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.804392 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cbb5555364131c93c767ef634af82b6a-config\") pod \"kube-apiserver-proxy-ip-10-0-132-191.ec2.internal\" (UID: \"cbb5555364131c93c767ef634af82b6a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.900117 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:40.900068 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:40.905545 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.905523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cbb5555364131c93c767ef634af82b6a-config\") pod \"kube-apiserver-proxy-ip-10-0-132-191.ec2.internal\" (UID: \"cbb5555364131c93c767ef634af82b6a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.905648 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.905556 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ef362994efc56c5a4f05cd0c7122fff2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal\" (UID: \"ef362994efc56c5a4f05cd0c7122fff2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.905648 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.905585 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef362994efc56c5a4f05cd0c7122fff2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal\" (UID: \"ef362994efc56c5a4f05cd0c7122fff2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.905648 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.905624 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cbb5555364131c93c767ef634af82b6a-config\") pod \"kube-apiserver-proxy-ip-10-0-132-191.ec2.internal\" (UID: \"cbb5555364131c93c767ef634af82b6a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.905648 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.905641 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef362994efc56c5a4f05cd0c7122fff2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal\" (UID: \"ef362994efc56c5a4f05cd0c7122fff2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:40.905790 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:40.905627 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ef362994efc56c5a4f05cd0c7122fff2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal\" (UID: \"ef362994efc56c5a4f05cd0c7122fff2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:41.001016 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:41.000937 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:41.030393 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.030367 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:41.033983 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.033965 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:41.101822 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:41.101781 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:41.202227 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:41.202188 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:41.302610 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:41.302543 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:41.402990 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:41.402959 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:41.411307 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.411285 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:29:41.411505 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.411484 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:29:41.411552 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.411485 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:29:41.470745 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:41.470708 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb5555364131c93c767ef634af82b6a.slice/crio-43b2da352f4c477977716f846025989332b50f181398f92f730191bbe379c639 WatchSource:0}: Error finding container 43b2da352f4c477977716f846025989332b50f181398f92f730191bbe379c639: Status 404 returned error can't find the container with id 43b2da352f4c477977716f846025989332b50f181398f92f730191bbe379c639
Apr 16 16:29:41.471236 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:41.471217 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef362994efc56c5a4f05cd0c7122fff2.slice/crio-26c831acbcbad6d025d787819abce5c69e5da25055a4de79fd83beafbd3b6643 WatchSource:0}: Error finding container 26c831acbcbad6d025d787819abce5c69e5da25055a4de79fd83beafbd3b6643: Status 404 returned error can't find the container with id 26c831acbcbad6d025d787819abce5c69e5da25055a4de79fd83beafbd3b6643
Apr 16 16:29:41.474961 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.474945 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:29:41.496633 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.496595 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:24:40 +0000 UTC" deadline="2027-09-09 21:57:27.493394315 +0000 UTC"
Apr 16 16:29:41.496633 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.496629 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12269h27m45.996768107s"
Apr 16 16:29:41.501957 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.501940 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:29:41.503562 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:41.503548 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-191.ec2.internal\" not found"
Apr 16 16:29:41.509683 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.509664 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:29:41.534418 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.534395 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wlpxz"
Apr 16 16:29:41.541476 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.541456 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wlpxz"
Apr 16 16:29:41.552927 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.552872 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:41.601967 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.601947 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:41.602892 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.602823 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal" event={"ID":"cbb5555364131c93c767ef634af82b6a","Type":"ContainerStarted","Data":"43b2da352f4c477977716f846025989332b50f181398f92f730191bbe379c639"}
Apr 16 16:29:41.603787 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.603768 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal" event={"ID":"ef362994efc56c5a4f05cd0c7122fff2","Type":"ContainerStarted","Data":"26c831acbcbad6d025d787819abce5c69e5da25055a4de79fd83beafbd3b6643"}
Apr 16 16:29:41.614286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.614264 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:29:41.616038 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.616025 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal"
Apr 16 16:29:41.623484 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.623465 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:29:41.731395 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:41.731284 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:42.283093 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.283053 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:42.378026 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.377997 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:42.482155 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.482121 2569 apiserver.go:52] "Watching apiserver"
Apr 16 16:29:42.487647 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.487624 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:29:42.487983 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.487962 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-smsng","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd","openshift-cluster-node-tuning-operator/tuned-8nd6p","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal","openshift-multus/multus-additional-cni-plugins-f86f6","openshift-multus/multus-q5298","openshift-network-diagnostics/network-check-target-9jq9v","openshift-network-operator/iptables-alerter-7xm7g","kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal","openshift-dns/node-resolver-zcdxw","openshift-image-registry/node-ca-r42bt","openshift-multus/network-metrics-daemon-ff4ns","openshift-ovn-kubernetes/ovnkube-node-b42xz"]
Apr 16 16:29:42.490775 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.490753 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zcdxw"
Apr 16 16:29:42.491979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.491951 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd"
Apr 16 16:29:42.492846 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.492829 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 16:29:42.493108 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.493091 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wpjpl\""
Apr 16 16:29:42.493193 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.493094 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 16:29:42.493269 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.493251 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.494312 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.494291 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 16:29:42.494414 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.494356 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 16:29:42.494513 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.494490 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-82746\""
Apr 16 16:29:42.494595 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.494503 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 16:29:42.495348 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.495332 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 16:29:42.495564 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.495547 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:29:42.495812 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.495781 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gl475\""
Apr 16 16:29:42.495942 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.495921 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.497214 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.497197 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f86f6"
Apr 16 16:29:42.497844 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.497816 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 16:29:42.498166 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.498147 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 16:29:42.498336 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.498202 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 16:29:42.498498 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.498454 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:29:42.498570 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.498509 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 16:29:42.498570 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.498553 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7xm7g"
Apr 16 16:29:42.498680 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.498633 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rvtq8\""
Apr 16 16:29:42.498680 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:42.498607 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a"
Apr 16 16:29:42.499260 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.498928 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 16:29:42.499260 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.499199 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 16:29:42.499595 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.499448 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qrl64\""
Apr 16 16:29:42.499755 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.499738 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-smsng"
Apr 16 16:29:42.500804 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.500595 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:29:42.500804 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.500662 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 16:29:42.500804 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.500684 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 16:29:42.500804 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.500689 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-4wxsw\""
Apr 16 16:29:42.501074 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.501054 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-r42bt"
Apr 16 16:29:42.502103 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.502081 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 16:29:42.502461 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.502402 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cqrjn\""
Apr 16 16:29:42.503516 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.503023 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 16:29:42.503655 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.503634 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5p66s\""
Apr 16 16:29:42.504181 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.503965 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 16:29:42.504487 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.504348 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns"
Apr 16 16:29:42.506508 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:42.504660 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371"
Apr 16 16:29:42.506508 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.506210 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 16:29:42.506508 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.506356 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 16:29:42.507305 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.507288 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.509410 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.509391 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 16:29:42.509576 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.509509 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 16:29:42.509643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.509619 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 16:29:42.509787 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.509730 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 16:29:42.509787 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.509780 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 16:29:42.510097 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.510079 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-w97np\""
Apr 16 16:29:42.510207 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.510165 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 16:29:42.514914 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.514893 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pj9m\" (UniqueName: \"kubernetes.io/projected/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-kube-api-access-8pj9m\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.515016 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.514929 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6"
Apr 16 16:29:42.515016 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.514949 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-os-release\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.515016 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.514973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c700cbcd-8214-4f4c-b770-1c0db784bc7b-iptables-alerter-script\") pod \"iptables-alerter-7xm7g\" (UID: \"c700cbcd-8214-4f4c-b770-1c0db784bc7b\") " pod="openshift-network-operator/iptables-alerter-7xm7g"
Apr 16 16:29:42.515174 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515020 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cqkc\" (UniqueName: \"kubernetes.io/projected/30ff43b1-927c-4bb7-9b93-70440addb1ed-kube-api-access-4cqkc\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd"
Apr 16 16:29:42.515174 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515044 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-systemd\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.515174 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515089 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd"
Apr 16 16:29:42.515174 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns"
Apr 16 16:29:42.515174 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515159 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-socket-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd"
Apr 16 16:29:42.515423 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515183 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-modprobe-d\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.515423 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-cni-binary-copy\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.515423 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515227 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-var-lib-cni-bin\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.515423 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515250 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c-tmp-dir\") pod \"node-resolver-zcdxw\" (UID: \"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c\") " pod="openshift-dns/node-resolver-zcdxw"
Apr 16 16:29:42.515423 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515270 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-run-k8s-cni-cncf-io\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.515423 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515335 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-etc-kubernetes\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.515423 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515374 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-sysctl-d\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.515423 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515400 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-var-lib-kubelet\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515424 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-tuned\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515520 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-run-netns\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515552 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-hostroot\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515575 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c-hosts-file\") pod \"node-resolver-zcdxw\" (UID: \"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c\") " pod="openshift-dns/node-resolver-zcdxw"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-run\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515621 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-var-lib-kubelet\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515667 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4240101a-1b9a-426e-bf0c-bc8b7b372154-host\") pod \"node-ca-r42bt\" (UID: \"4240101a-1b9a-426e-bf0c-bc8b7b372154\") " pod="openshift-image-registry/node-ca-r42bt"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515688 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5bc2\" (UniqueName: \"kubernetes.io/projected/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-kube-api-access-c5bc2\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515714 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-tmp\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515734 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-system-cni-dir\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515753 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-cnibin\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-host\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515814 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d458bdca-23bd-4bb6-b0ec-a3050b306786-agent-certs\") pod \"konnectivity-agent-smsng\" (UID: \"d458bdca-23bd-4bb6-b0ec-a3050b306786\") " pod="kube-system/konnectivity-agent-smsng"
Apr 16 16:29:42.515838 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515835 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-registration-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515854 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-device-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515890 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d458bdca-23bd-4bb6-b0ec-a3050b306786-konnectivity-ca\") pod \"konnectivity-agent-smsng\" (UID: \"d458bdca-23bd-4bb6-b0ec-a3050b306786\") " pod="kube-system/konnectivity-agent-smsng"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515921 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1666c169-2943-4cf3-8a4a-51fc0345056b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515960 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48l46\" (UniqueName: \"kubernetes.io/projected/d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c-kube-api-access-48l46\") pod \"node-resolver-zcdxw\" (UID: \"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c\") " pod="openshift-dns/node-resolver-zcdxw"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.515988 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1666c169-2943-4cf3-8a4a-51fc0345056b-cni-binary-copy\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516033 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-var-lib-cni-multus\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516076 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbfbn\" (UniqueName: \"kubernetes.io/projected/1666c169-2943-4cf3-8a4a-51fc0345056b-kube-api-access-sbfbn\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516145 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n99kf\" (UniqueName: \"kubernetes.io/projected/c700cbcd-8214-4f4c-b770-1c0db784bc7b-kube-api-access-n99kf\") pod \"iptables-alerter-7xm7g\" (UID: \"c700cbcd-8214-4f4c-b770-1c0db784bc7b\") " pod="openshift-network-operator/iptables-alerter-7xm7g"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516174 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-etc-selinux\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516220 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xg8r\" (UniqueName: \"kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r\") pod \"network-check-target-9jq9v\" (UID: \"f81e14b6-a4d4-417f-9556-bdceafdafe3a\") " pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-sysconfig\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516279 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-kubernetes\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516303 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-sysctl-conf\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516333 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-system-cni-dir\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-run-multus-certs\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.516495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516401 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlgv8\" (UniqueName: \"kubernetes.io/projected/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-kube-api-access-wlgv8\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516465 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4240101a-1b9a-426e-bf0c-bc8b7b372154-serviceca\") pod \"node-ca-r42bt\" (UID: \"4240101a-1b9a-426e-bf0c-bc8b7b372154\") " pod="openshift-image-registry/node-ca-r42bt"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516485 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrptt\" (UniqueName: \"kubernetes.io/projected/4240101a-1b9a-426e-bf0c-bc8b7b372154-kube-api-access-hrptt\") pod \"node-ca-r42bt\" (UID: \"4240101a-1b9a-426e-bf0c-bc8b7b372154\") " pod="openshift-image-registry/node-ca-r42bt"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516499 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-sys\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-sys-fs\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-os-release\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516576 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-socket-dir-parent\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516593 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-conf-dir\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516639 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-daemon-config\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516670 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c700cbcd-8214-4f4c-b770-1c0db784bc7b-host-slash\") pod \"iptables-alerter-7xm7g\" (UID: \"c700cbcd-8214-4f4c-b770-1c0db784bc7b\") " pod="openshift-network-operator/iptables-alerter-7xm7g"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516691 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-lib-modules\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516708 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-cni-dir\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-cnibin\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6"
Apr 16 16:29:42.517346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.516737 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1666c169-2943-4cf3-8a4a-51fc0345056b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6"
Apr 16 16:29:42.542051 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.542019 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:24:41 +0000 UTC" deadline="2027-10-09 18:06:19.416588776 +0000 UTC"
Apr 16 16:29:42.542051 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.542051 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12985h36m36.874541701s"
Apr 16 16:29:42.604253 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.604229 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 16:29:42.616997 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.616974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-etc-kubernetes\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.617152 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-run-openvswitch\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.617152 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617048 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-run-ovn\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.617152 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617073 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.617152 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617102 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-etc-kubernetes\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.617303 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617135 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-sysctl-d\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.617303 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617291 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-sysctl-d\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.617625 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617603 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-var-lib-kubelet\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.617751 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617637 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-tuned\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.617751 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-var-lib-kubelet\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.617751 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617671 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-run-netns\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.617751 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-hostroot\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.617751 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617719 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-kubelet\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.617751 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617742 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/18ea2b0b-1348-4827-969f-18c4a33a0dc8-ovnkube-script-lib\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617790 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-hostroot\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617807 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c-hosts-file\") pod \"node-resolver-zcdxw\" (UID: \"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c\") " pod="openshift-dns/node-resolver-zcdxw"
Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-run-netns\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617954 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c-hosts-file\") pod \"node-resolver-zcdxw\" (UID: \"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c\") " pod="openshift-dns/node-resolver-zcdxw"
Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.617955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-run\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-var-lib-kubelet\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618007 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618027 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-run\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618033 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4240101a-1b9a-426e-bf0c-bc8b7b372154-host\") pod \"node-ca-r42bt\" (UID: \"4240101a-1b9a-426e-bf0c-bc8b7b372154\") " pod="openshift-image-registry/node-ca-r42bt" Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618061 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5bc2\" (UniqueName: \"kubernetes.io/projected/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-kube-api-access-c5bc2\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:42.618099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-slash\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.618534 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618116 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-node-log\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.618534 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618125 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4240101a-1b9a-426e-bf0c-bc8b7b372154-host\") pod \"node-ca-r42bt\" (UID: \"4240101a-1b9a-426e-bf0c-bc8b7b372154\") " pod="openshift-image-registry/node-ca-r42bt" Apr 16 16:29:42.618534 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618078 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-var-lib-kubelet\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.618841 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618556 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/18ea2b0b-1348-4827-969f-18c4a33a0dc8-ovn-node-metrics-cert\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.618986 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.618931 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-tmp\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.619067 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619010 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-system-cni-dir\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.619167 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619133 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-system-cni-dir\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.619230 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-cnibin\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.619276 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619262 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-systemd-units\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.619327 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619303 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-host\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.619375 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619344 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d458bdca-23bd-4bb6-b0ec-a3050b306786-agent-certs\") pod \"konnectivity-agent-smsng\" (UID: \"d458bdca-23bd-4bb6-b0ec-a3050b306786\") " pod="kube-system/konnectivity-agent-smsng" Apr 16 16:29:42.619443 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-registration-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.619443 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619410 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-device-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.619541 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-cni-bin\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.619541 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d458bdca-23bd-4bb6-b0ec-a3050b306786-konnectivity-ca\") pod \"konnectivity-agent-smsng\" (UID: \"d458bdca-23bd-4bb6-b0ec-a3050b306786\") " pod="kube-system/konnectivity-agent-smsng" Apr 16 16:29:42.619541 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1666c169-2943-4cf3-8a4a-51fc0345056b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.619672 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48l46\" (UniqueName: \"kubernetes.io/projected/d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c-kube-api-access-48l46\") pod \"node-resolver-zcdxw\" (UID: \"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c\") " pod="openshift-dns/node-resolver-zcdxw" Apr 16 16:29:42.619672 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1666c169-2943-4cf3-8a4a-51fc0345056b-cni-binary-copy\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.619672 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619642 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-var-lib-cni-multus\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.619803 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619686 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-var-lib-openvswitch\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.619803 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619731 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-etc-openvswitch\") pod \"ovnkube-node-b42xz\" (UID: 
\"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.619945 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619925 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-cnibin\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.620008 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.619932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfbn\" (UniqueName: \"kubernetes.io/projected/1666c169-2943-4cf3-8a4a-51fc0345056b-kube-api-access-sbfbn\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.620109 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620097 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n99kf\" (UniqueName: \"kubernetes.io/projected/c700cbcd-8214-4f4c-b770-1c0db784bc7b-kube-api-access-n99kf\") pod \"iptables-alerter-7xm7g\" (UID: \"c700cbcd-8214-4f4c-b770-1c0db784bc7b\") " pod="openshift-network-operator/iptables-alerter-7xm7g" Apr 16 16:29:42.620199 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-etc-selinux\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.620285 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620274 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg8r\" (UniqueName: \"kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r\") pod \"network-check-target-9jq9v\" (UID: \"f81e14b6-a4d4-417f-9556-bdceafdafe3a\") " pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:29:42.620378 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620367 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-sysconfig\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.620496 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-kubernetes\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.620599 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620587 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-sysctl-conf\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.620687 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620676 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-system-cni-dir\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.620782 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-run-multus-certs\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.620873 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620861 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlgv8\" (UniqueName: \"kubernetes.io/projected/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-kube-api-access-wlgv8\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.621013 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4240101a-1b9a-426e-bf0c-bc8b7b372154-serviceca\") pod \"node-ca-r42bt\" (UID: \"4240101a-1b9a-426e-bf0c-bc8b7b372154\") " pod="openshift-image-registry/node-ca-r42bt" Apr 16 16:29:42.621117 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621105 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrptt\" (UniqueName: \"kubernetes.io/projected/4240101a-1b9a-426e-bf0c-bc8b7b372154-kube-api-access-hrptt\") pod \"node-ca-r42bt\" (UID: \"4240101a-1b9a-426e-bf0c-bc8b7b372154\") " pod="openshift-image-registry/node-ca-r42bt" Apr 16 16:29:42.621195 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-sys\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.621292 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621280 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-sys-fs\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.621380 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-run-systemd\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.621588 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621563 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-run-multus-certs\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.621658 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621574 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-etc-selinux\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.621713 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621662 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-sysconfig\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.621769 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621713 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-system-cni-dir\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.621769 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-kubernetes\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.621860 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621836 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-sysctl-conf\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.621980 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.621956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-os-release\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.622045 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622011 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-socket-dir-parent\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.622102 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622053 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-conf-dir\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.622102 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622088 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-daemon-config\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 
16:29:42.622196 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-registration-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.622260 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c700cbcd-8214-4f4c-b770-1c0db784bc7b-host-slash\") pod \"iptables-alerter-7xm7g\" (UID: \"c700cbcd-8214-4f4c-b770-1c0db784bc7b\") " pod="openshift-network-operator/iptables-alerter-7xm7g" Apr 16 16:29:42.622260 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622243 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-run-netns\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.622372 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622271 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-lib-modules\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622413 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4240101a-1b9a-426e-bf0c-bc8b7b372154-serviceca\") pod \"node-ca-r42bt\" (UID: \"4240101a-1b9a-426e-bf0c-bc8b7b372154\") " pod="openshift-image-registry/node-ca-r42bt" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622400 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-lib-modules\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622622 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-sys\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-sys-fs\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622721 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-daemon-config\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " 
pod="openshift-multus/multus-q5298" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622764 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-device-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622844 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d458bdca-23bd-4bb6-b0ec-a3050b306786-konnectivity-ca\") pod \"konnectivity-agent-smsng\" (UID: \"d458bdca-23bd-4bb6-b0ec-a3050b306786\") " pod="kube-system/konnectivity-agent-smsng" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622857 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-os-release\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622926 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-socket-dir-parent\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.620004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-host\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.622963 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-tuned\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623012 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-cni-dir\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-cnibin\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.623286 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1666c169-2943-4cf3-8a4a-51fc0345056b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f86f6\" (UID: 
\"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623352 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-conf-dir\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623406 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/18ea2b0b-1348-4827-969f-18c4a33a0dc8-env-overrides\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623457 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m64x\" (UniqueName: \"kubernetes.io/projected/18ea2b0b-1348-4827-969f-18c4a33a0dc8-kube-api-access-5m64x\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pj9m\" (UniqueName: \"kubernetes.io/projected/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-kube-api-access-8pj9m\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623566 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623630 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-multus-cni-dir\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-os-release\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623668 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c700cbcd-8214-4f4c-b770-1c0db784bc7b-iptables-alerter-script\") pod \"iptables-alerter-7xm7g\" (UID: \"c700cbcd-8214-4f4c-b770-1c0db784bc7b\") " pod="openshift-network-operator/iptables-alerter-7xm7g" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623713 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-tmp\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623812 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-os-release\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623807 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-var-lib-cni-multus\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623892 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cqkc\" (UniqueName: \"kubernetes.io/projected/30ff43b1-927c-4bb7-9b93-70440addb1ed-kube-api-access-4cqkc\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.623979 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623976 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c700cbcd-8214-4f4c-b770-1c0db784bc7b-host-slash\") pod \"iptables-alerter-7xm7g\" (UID: \"c700cbcd-8214-4f4c-b770-1c0db784bc7b\") " pod="openshift-network-operator/iptables-alerter-7xm7g" Apr 16 16:29:42.624581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.623986 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-log-socket\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.624581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624021 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-cnibin\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.624581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624081 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-cni-netd\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.624581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624116 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-systemd\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.624581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624142 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.624581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:42.624581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624269 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-run-ovn-kubernetes\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.624581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624302 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/18ea2b0b-1348-4827-969f-18c4a33a0dc8-ovnkube-config\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.624581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-socket-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.624581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624568 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c700cbcd-8214-4f4c-b770-1c0db784bc7b-iptables-alerter-script\") pod \"iptables-alerter-7xm7g\" (UID: \"c700cbcd-8214-4f4c-b770-1c0db784bc7b\") " pod="openshift-network-operator/iptables-alerter-7xm7g" Apr 16 16:29:42.625037 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624404 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-modprobe-d\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.625037 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624563 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-modprobe-d\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.625037 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: 
\"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.625037 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624717 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d458bdca-23bd-4bb6-b0ec-a3050b306786-agent-certs\") pod \"konnectivity-agent-smsng\" (UID: \"d458bdca-23bd-4bb6-b0ec-a3050b306786\") " pod="kube-system/konnectivity-agent-smsng" Apr 16 16:29:42.625037 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:42.624750 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:42.625037 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624847 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-cni-binary-copy\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.625037 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1666c169-2943-4cf3-8a4a-51fc0345056b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.625316 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624883 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-etc-systemd\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.625316 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.624893 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/30ff43b1-927c-4bb7-9b93-70440addb1ed-socket-dir\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.625316 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.625105 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-var-lib-cni-bin\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.625316 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:42.625213 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs podName:3fbad60e-9cf1-43dd-abb0-8d7c1caab371 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:43.125170282 +0000 UTC m=+3.069680266 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs") pod "network-metrics-daemon-ff4ns" (UID: "3fbad60e-9cf1-43dd-abb0-8d7c1caab371") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:42.625316 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.625216 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-var-lib-cni-bin\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.625316 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.625232 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c-tmp-dir\") pod \"node-resolver-zcdxw\" (UID: \"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c\") " pod="openshift-dns/node-resolver-zcdxw" Apr 16 16:29:42.625316 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.625277 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-run-k8s-cni-cncf-io\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.625643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.625363 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-host-run-k8s-cni-cncf-io\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.626028 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.625879 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c-tmp-dir\") pod \"node-resolver-zcdxw\" (UID: \"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c\") " pod="openshift-dns/node-resolver-zcdxw" Apr 16 16:29:42.627620 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:42.627598 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:29:42.627711 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:42.627626 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:29:42.627711 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:42.627638 2569 projected.go:194] Error preparing data for projected volume kube-api-access-2xg8r for pod openshift-network-diagnostics/network-check-target-9jq9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:42.627816 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:42.627747 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r podName:f81e14b6-a4d4-417f-9556-bdceafdafe3a nodeName:}" failed. No retries permitted until 2026-04-16 16:29:43.127732265 +0000 UTC m=+3.072242256 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2xg8r" (UniqueName: "kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r") pod "network-check-target-9jq9v" (UID: "f81e14b6-a4d4-417f-9556-bdceafdafe3a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:42.629571 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.629020 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5bc2\" (UniqueName: \"kubernetes.io/projected/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-kube-api-access-c5bc2\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:42.630626 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.630602 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrptt\" (UniqueName: \"kubernetes.io/projected/4240101a-1b9a-426e-bf0c-bc8b7b372154-kube-api-access-hrptt\") pod \"node-ca-r42bt\" (UID: \"4240101a-1b9a-426e-bf0c-bc8b7b372154\") " pod="openshift-image-registry/node-ca-r42bt" Apr 16 16:29:42.630702 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.630685 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlgv8\" (UniqueName: \"kubernetes.io/projected/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-kube-api-access-wlgv8\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.630771 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.630708 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n99kf\" (UniqueName: \"kubernetes.io/projected/c700cbcd-8214-4f4c-b770-1c0db784bc7b-kube-api-access-n99kf\") pod \"iptables-alerter-7xm7g\" (UID: \"c700cbcd-8214-4f4c-b770-1c0db784bc7b\") " pod="openshift-network-operator/iptables-alerter-7xm7g" Apr 16 16:29:42.632136 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.632116 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pj9m\" (UniqueName: \"kubernetes.io/projected/91b79665-6a07-4b37-bdfb-a4cd7ab285a2-kube-api-access-8pj9m\") pod \"tuned-8nd6p\" (UID: \"91b79665-6a07-4b37-bdfb-a4cd7ab285a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" Apr 16 16:29:42.632233 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.632156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cqkc\" (UniqueName: \"kubernetes.io/projected/30ff43b1-927c-4bb7-9b93-70440addb1ed-kube-api-access-4cqkc\") pod \"aws-ebs-csi-driver-node-zgpkd\" (UID: \"30ff43b1-927c-4bb7-9b93-70440addb1ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" Apr 16 16:29:42.633028 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.632898 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca349f5f-0121-41f2-99a5-676a6e9d7f2c-cni-binary-copy\") pod \"multus-q5298\" (UID: \"ca349f5f-0121-41f2-99a5-676a6e9d7f2c\") " pod="openshift-multus/multus-q5298" Apr 16 16:29:42.633028 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.632989 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1666c169-2943-4cf3-8a4a-51fc0345056b-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.633571 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.633521 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1666c169-2943-4cf3-8a4a-51fc0345056b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.633571 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.633524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1666c169-2943-4cf3-8a4a-51fc0345056b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.634626 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.634601 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfbn\" (UniqueName: \"kubernetes.io/projected/1666c169-2943-4cf3-8a4a-51fc0345056b-kube-api-access-sbfbn\") pod \"multus-additional-cni-plugins-f86f6\" (UID: \"1666c169-2943-4cf3-8a4a-51fc0345056b\") " pod="openshift-multus/multus-additional-cni-plugins-f86f6" Apr 16 16:29:42.635048 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.635029 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48l46\" (UniqueName: \"kubernetes.io/projected/d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c-kube-api-access-48l46\") pod \"node-resolver-zcdxw\" (UID: \"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c\") " pod="openshift-dns/node-resolver-zcdxw" Apr 16 16:29:42.726322 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-run-netns\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.726589 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726329 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/18ea2b0b-1348-4827-969f-18c4a33a0dc8-env-overrides\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.726589 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m64x\" (UniqueName: \"kubernetes.io/projected/18ea2b0b-1348-4827-969f-18c4a33a0dc8-kube-api-access-5m64x\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.726589 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726388 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-log-socket\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:42.726589 ip-10-0-132-191 kubenswrapper[2569]: I0416 
Apr 16 16:29:42.726589 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726412 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-cni-netd\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.726589 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726405 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-run-netns\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.726589 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726481 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-cni-netd\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.726589 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726501 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-log-socket\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.726589 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726513 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-run-ovn-kubernetes\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.726589 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726562 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/18ea2b0b-1348-4827-969f-18c4a33a0dc8-ovnkube-config\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-run-ovn-kubernetes\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726605 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-run-openvswitch\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726638 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-run-openvswitch\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-run-ovn\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726700 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726723 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-kubelet\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726740 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/18ea2b0b-1348-4827-969f-18c4a33a0dc8-ovnkube-script-lib\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726752 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-run-ovn\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726757 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726759 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-slash\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-node-log\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726813 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-kubelet\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726820 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/18ea2b0b-1348-4827-969f-18c4a33a0dc8-ovn-node-metrics-cert\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726858 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-systemd-units\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726885 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-cni-bin\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/18ea2b0b-1348-4827-969f-18c4a33a0dc8-env-overrides\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726901 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-systemd-units\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726911 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-var-lib-openvswitch\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726859 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-slash\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-etc-openvswitch\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726962 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-var-lib-openvswitch\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726976 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-node-log\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726948 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-host-cni-bin\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.726999 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-etc-openvswitch\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.727004 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-run-systemd\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.727055 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/18ea2b0b-1348-4827-969f-18c4a33a0dc8-run-systemd\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.727154 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/18ea2b0b-1348-4827-969f-18c4a33a0dc8-ovnkube-config\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.727719 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.727262 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/18ea2b0b-1348-4827-969f-18c4a33a0dc8-ovnkube-script-lib\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.729443 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.729412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/18ea2b0b-1348-4827-969f-18c4a33a0dc8-ovn-node-metrics-cert\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.734345 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.734321 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m64x\" (UniqueName: \"kubernetes.io/projected/18ea2b0b-1348-4827-969f-18c4a33a0dc8-kube-api-access-5m64x\") pod \"ovnkube-node-b42xz\" (UID: \"18ea2b0b-1348-4827-969f-18c4a33a0dc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-b42xz"
Apr 16 16:29:42.804211 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.804141 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zcdxw"
Apr 16 16:29:42.813063 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.813043 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd"
Apr 16 16:29:42.818730 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.818696 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8nd6p"
Apr 16 16:29:42.827267 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.827246 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q5298"
Apr 16 16:29:42.833320 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.833301 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f86f6"
Apr 16 16:29:42.839825 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.839807 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7xm7g"
Apr 16 16:29:42.847337 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.847321 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-smsng"
Apr 16 16:29:42.853858 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:42.853837 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-r42bt"
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:29:43.010692 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:43.010646 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4240101a_1b9a_426e_bf0c_bc8b7b372154.slice/crio-ae6d89ad2b986b321520cb149b4fa3f8d710957cfcc13521ca093a712df38f56 WatchSource:0}: Error finding container ae6d89ad2b986b321520cb149b4fa3f8d710957cfcc13521ca093a712df38f56: Status 404 returned error can't find the container with id ae6d89ad2b986b321520cb149b4fa3f8d710957cfcc13521ca093a712df38f56 Apr 16 16:29:43.011824 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:43.011799 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ea2b0b_1348_4827_969f_18c4a33a0dc8.slice/crio-f53c0d396ef1599bf97ac49058da603eaf724054ccd1797992cd302a41b52400 WatchSource:0}: Error finding container f53c0d396ef1599bf97ac49058da603eaf724054ccd1797992cd302a41b52400: Status 404 returned error can't find the container with id f53c0d396ef1599bf97ac49058da603eaf724054ccd1797992cd302a41b52400 Apr 16 16:29:43.015021 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:43.015000 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b79665_6a07_4b37_bdfb_a4cd7ab285a2.slice/crio-7d1e6531d4ec18953313515563593fa6cd56b6a756a42675026d951552795c97 WatchSource:0}: Error finding container 7d1e6531d4ec18953313515563593fa6cd56b6a756a42675026d951552795c97: Status 404 returned error can't find the container with id 7d1e6531d4ec18953313515563593fa6cd56b6a756a42675026d951552795c97 Apr 16 16:29:43.015699 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:43.015675 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc700cbcd_8214_4f4c_b770_1c0db784bc7b.slice/crio-b3ed09dc2fd37a3d6febade49deed1832c7ac7019fe9b98d4be20ee072d8119b WatchSource:0}: Error finding container b3ed09dc2fd37a3d6febade49deed1832c7ac7019fe9b98d4be20ee072d8119b: Status 404 returned error can't find the container with id b3ed09dc2fd37a3d6febade49deed1832c7ac7019fe9b98d4be20ee072d8119b Apr 16 16:29:43.016463 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:43.016426 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30ff43b1_927c_4bb7_9b93_70440addb1ed.slice/crio-181394fac949d7ef733702f7d5dce806828a873286aa3aa0a804ddeaf6f4b443 WatchSource:0}: Error finding container 181394fac949d7ef733702f7d5dce806828a873286aa3aa0a804ddeaf6f4b443: Status 404 returned error can't find the container with id 181394fac949d7ef733702f7d5dce806828a873286aa3aa0a804ddeaf6f4b443 Apr 16 16:29:43.017206 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:43.017181 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd458bdca_23bd_4bb6_b0ec_a3050b306786.slice/crio-4c9f53ea01820d138fc6b0a113eb4c2f2b009dc93e0f1c912cb560548a9569ce WatchSource:0}: Error finding container 4c9f53ea01820d138fc6b0a113eb4c2f2b009dc93e0f1c912cb560548a9569ce: Status 404 returned error can't find the container with id 4c9f53ea01820d138fc6b0a113eb4c2f2b009dc93e0f1c912cb560548a9569ce Apr 16 16:29:43.018446 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:43.018411 2569 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1666c169_2943_4cf3_8a4a_51fc0345056b.slice/crio-6ba773d1bc32341a8a23e951a7b8f973489a121510e251ec5357053c5b188492 WatchSource:0}: Error finding container 6ba773d1bc32341a8a23e951a7b8f973489a121510e251ec5357053c5b188492: Status 404 returned error can't find the container with id 6ba773d1bc32341a8a23e951a7b8f973489a121510e251ec5357053c5b188492 Apr 16 16:29:43.018869 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:43.018845 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2c2bb39_a1e3_4a2f_b48b_4c18cd10ce8c.slice/crio-50859b2163e31f9253de6b5ce411fde3157e6edc6a4674c4da4f25cb21c4dfd3 WatchSource:0}: Error finding container 50859b2163e31f9253de6b5ce411fde3157e6edc6a4674c4da4f25cb21c4dfd3: Status 404 returned error can't find the container with id 50859b2163e31f9253de6b5ce411fde3157e6edc6a4674c4da4f25cb21c4dfd3 Apr 16 16:29:43.019795 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:29:43.019764 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca349f5f_0121_41f2_99a5_676a6e9d7f2c.slice/crio-9336787a9d43aa45630f39edbd03a529a6947281d39bd0ff22e31d9bba1ce513 WatchSource:0}: Error finding container 9336787a9d43aa45630f39edbd03a529a6947281d39bd0ff22e31d9bba1ce513: Status 404 returned error can't find the container with id 9336787a9d43aa45630f39edbd03a529a6947281d39bd0ff22e31d9bba1ce513 Apr 16 16:29:43.130708 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.130549 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg8r\" (UniqueName: \"kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r\") pod \"network-check-target-9jq9v\" (UID: \"f81e14b6-a4d4-417f-9556-bdceafdafe3a\") " pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:29:43.130879 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:43.130732 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:29:43.130879 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.130744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:43.130879 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:43.130754 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:29:43.130879 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:43.130767 2569 projected.go:194] Error preparing data for projected volume kube-api-access-2xg8r for pod openshift-network-diagnostics/network-check-target-9jq9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:43.130879 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:43.130823 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r 
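[Annotation] The manager.go:1169 warnings above come from cadvisor's cgroup watcher racing CRI-O: the cgroup slice appears before the container is queryable, so the lookup returns 404. Each of these container IDs shows up moments later in a PLEG ContainerStarted event, which is why the warnings are transient. A sketch to cross-check the two record types, same placeholder filename:

import re

# Match cadvisor's "can't find the container" warnings against the PLEG
# ContainerStarted events that follow once CRI-O registers the container.
watch_pat = re.compile(r"can't find the container with id ([0-9a-f]{64})")
pleg_pat = re.compile(r'"Type":"ContainerStarted","Data":"([0-9a-f]{64})"')
watched, started = set(), set()
with open("kubelet.log") as f:  # placeholder path
    for line in f:
        watched.update(watch_pat.findall(line))
        started.update(pleg_pat.findall(line))
print("warned then started:", len(watched & started))
print("warned, never started:", sorted(watched - started))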
Apr 16 16:29:43.130879 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:43.130823 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r podName:f81e14b6-a4d4-417f-9556-bdceafdafe3a nodeName:}" failed. No retries permitted until 2026-04-16 16:29:44.130804374 +0000 UTC m=+4.075314347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2xg8r" (UniqueName: "kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r") pod "network-check-target-9jq9v" (UID: "f81e14b6-a4d4-417f-9556-bdceafdafe3a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:43.130879 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:43.130840 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:43.130879 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:43.130882 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs podName:3fbad60e-9cf1-43dd-abb0-8d7c1caab371 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:44.130870773 +0000 UTC m=+4.075380744 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs") pod "network-metrics-daemon-ff4ns" (UID: "3fbad60e-9cf1-43dd-abb0-8d7c1caab371") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:43.546026 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.545987 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:24:41 +0000 UTC" deadline="2027-10-01 02:36:17.733660039 +0000 UTC"
Apr 16 16:29:43.546026 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.546022 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12778h6m34.18764112s"
Apr 16 16:29:43.617822 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.617740 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" event={"ID":"91b79665-6a07-4b37-bdfb-a4cd7ab285a2","Type":"ContainerStarted","Data":"7d1e6531d4ec18953313515563593fa6cd56b6a756a42675026d951552795c97"}
Apr 16 16:29:43.631654 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.631599 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" event={"ID":"18ea2b0b-1348-4827-969f-18c4a33a0dc8","Type":"ContainerStarted","Data":"f53c0d396ef1599bf97ac49058da603eaf724054ccd1797992cd302a41b52400"}
Apr 16 16:29:43.639817 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.639787 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r42bt" event={"ID":"4240101a-1b9a-426e-bf0c-bc8b7b372154","Type":"ContainerStarted","Data":"ae6d89ad2b986b321520cb149b4fa3f8d710957cfcc13521ca093a712df38f56"}
Apr 16 16:29:43.644139 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.644111 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal" event={"ID":"cbb5555364131c93c767ef634af82b6a","Type":"ContainerStarted","Data":"e6b4f02d1820aa176c833c11bf5bc7c29a53ff0380f1604d937de19455b05cc8"}
Apr 16 16:29:43.647106 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.647081 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zcdxw" event={"ID":"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c","Type":"ContainerStarted","Data":"50859b2163e31f9253de6b5ce411fde3157e6edc6a4674c4da4f25cb21c4dfd3"}
Apr 16 16:29:43.655864 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.655839 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-smsng" event={"ID":"d458bdca-23bd-4bb6-b0ec-a3050b306786","Type":"ContainerStarted","Data":"4c9f53ea01820d138fc6b0a113eb4c2f2b009dc93e0f1c912cb560548a9569ce"}
Apr 16 16:29:43.659908 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.659884 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f86f6" event={"ID":"1666c169-2943-4cf3-8a4a-51fc0345056b","Type":"ContainerStarted","Data":"6ba773d1bc32341a8a23e951a7b8f973489a121510e251ec5357053c5b188492"}
Apr 16 16:29:43.667256 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.667233 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q5298" event={"ID":"ca349f5f-0121-41f2-99a5-676a6e9d7f2c","Type":"ContainerStarted","Data":"9336787a9d43aa45630f39edbd03a529a6947281d39bd0ff22e31d9bba1ce513"}
Apr 16 16:29:43.671724 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.671699 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" event={"ID":"30ff43b1-927c-4bb7-9b93-70440addb1ed","Type":"ContainerStarted","Data":"181394fac949d7ef733702f7d5dce806828a873286aa3aa0a804ddeaf6f4b443"}
Apr 16 16:29:43.678954 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:43.678929 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7xm7g" event={"ID":"c700cbcd-8214-4f4c-b770-1c0db784bc7b","Type":"ContainerStarted","Data":"b3ed09dc2fd37a3d6febade49deed1832c7ac7019fe9b98d4be20ee072d8119b"}
Apr 16 16:29:44.141259 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:44.140953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns"
Apr 16 16:29:44.141259 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:44.141029 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg8r\" (UniqueName: \"kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r\") pod \"network-check-target-9jq9v\" (UID: \"f81e14b6-a4d4-417f-9556-bdceafdafe3a\") " pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:29:44.141259 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:44.141111 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:44.141259 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:44.141159 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:29:44.141259 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:44.141174 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:29:44.141259 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:44.141181 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs podName:3fbad60e-9cf1-43dd-abb0-8d7c1caab371 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:46.141162709 +0000 UTC m=+6.085672694 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs") pod "network-metrics-daemon-ff4ns" (UID: "3fbad60e-9cf1-43dd-abb0-8d7c1caab371") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:44.141259 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:44.141186 2569 projected.go:194] Error preparing data for projected volume kube-api-access-2xg8r for pod openshift-network-diagnostics/network-check-target-9jq9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:44.141259 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:44.141235 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r podName:f81e14b6-a4d4-417f-9556-bdceafdafe3a nodeName:}" failed. No retries permitted until 2026-04-16 16:29:46.141219707 +0000 UTC m=+6.085729697 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2xg8r" (UniqueName: "kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r") pod "network-check-target-9jq9v" (UID: "f81e14b6-a4d4-417f-9556-bdceafdafe3a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:44.600328 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:44.600275 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:29:44.600783 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:44.600396 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a"
Apr 16 16:29:44.600783 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:44.600480 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns"
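[Annotation] Note how durationBeforeRetry doubles for the two volumes that keep failing ("metrics-certs" and "kube-api-access-2xg8r"): 1s and 2s here, then 4s, 8s, and 16s further down, and 500ms is the first delay used for "original-pull-secret" below. That is consistent with a simple exponential backoff; a minimal sketch that reproduces the sequence observed in this log:

# The retry delays recorded in these nestedpendingoperations records double
# on every failure. Doubling from the smallest delay seen in this log
# (500ms) up through the largest (16s) matches every value observed:
delay, schedule = 0.5, []
while delay <= 16:
    schedule.append(delay)
    delay *= 2
print(schedule)  # [0.5, 1.0, 2.0, 4.0, 8.0, 16.0]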
pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:29:44.712185 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:44.711556 2569 generic.go:358] "Generic (PLEG): container finished" podID="ef362994efc56c5a4f05cd0c7122fff2" containerID="309a49529bac221b5bd1222b82af66bce7f7f9d76b1179193c886ab7363ee5aa" exitCode=0 Apr 16 16:29:44.712185 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:44.711739 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal" event={"ID":"ef362994efc56c5a4f05cd0c7122fff2","Type":"ContainerDied","Data":"309a49529bac221b5bd1222b82af66bce7f7f9d76b1179193c886ab7363ee5aa"} Apr 16 16:29:44.726852 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:44.725050 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-191.ec2.internal" podStartSLOduration=3.725032767 podStartE2EDuration="3.725032767s" podCreationTimestamp="2026-04-16 16:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:29:43.657480993 +0000 UTC m=+3.601990987" watchObservedRunningTime="2026-04-16 16:29:44.725032767 +0000 UTC m=+4.669542759" Apr 16 16:29:45.719857 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:45.719118 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal" event={"ID":"ef362994efc56c5a4f05cd0c7122fff2","Type":"ContainerStarted","Data":"41de9c5458a6cc822181a22846a51eef0a85014b7495136e901e1ecb925216bd"} Apr 16 16:29:46.157746 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:46.157707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:46.157917 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:46.157799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg8r\" (UniqueName: \"kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r\") pod \"network-check-target-9jq9v\" (UID: \"f81e14b6-a4d4-417f-9556-bdceafdafe3a\") " pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:29:46.157994 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:46.157967 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:46.158045 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:46.158001 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:29:46.158045 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:46.158020 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:29:46.158045 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:46.158032 2569 projected.go:194] Error preparing data for projected volume kube-api-access-2xg8r for pod openshift-network-diagnostics/network-check-target-9jq9v: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:46.158160 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:46.158046 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs podName:3fbad60e-9cf1-43dd-abb0-8d7c1caab371 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:50.158026659 +0000 UTC m=+10.102536643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs") pod "network-metrics-daemon-ff4ns" (UID: "3fbad60e-9cf1-43dd-abb0-8d7c1caab371") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:46.158160 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:46.158088 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r podName:f81e14b6-a4d4-417f-9556-bdceafdafe3a nodeName:}" failed. No retries permitted until 2026-04-16 16:29:50.158070234 +0000 UTC m=+10.102580219 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2xg8r" (UniqueName: "kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r") pod "network-check-target-9jq9v" (UID: "f81e14b6-a4d4-417f-9556-bdceafdafe3a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:46.600515 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:46.600263 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:29:46.600515 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:46.600271 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:46.600515 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:46.600379 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:29:46.600515 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:46.600495 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:29:48.599778 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:48.599713 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:48.600243 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:48.599876 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:29:48.600243 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:48.599934 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:29:48.600243 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:48.599994 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:29:50.189848 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:50.189717 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:50.189848 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:50.189802 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg8r\" (UniqueName: \"kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r\") pod \"network-check-target-9jq9v\" (UID: \"f81e14b6-a4d4-417f-9556-bdceafdafe3a\") " pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:29:50.190383 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:50.189883 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:50.190383 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:50.189936 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:29:50.190383 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:50.189955 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:29:50.190383 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:50.189957 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs podName:3fbad60e-9cf1-43dd-abb0-8d7c1caab371 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:58.189936956 +0000 UTC m=+18.134446939 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs") pod "network-metrics-daemon-ff4ns" (UID: "3fbad60e-9cf1-43dd-abb0-8d7c1caab371") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:50.190383 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:50.189968 2569 projected.go:194] Error preparing data for projected volume kube-api-access-2xg8r for pod openshift-network-diagnostics/network-check-target-9jq9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:50.190383 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:50.190020 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r podName:f81e14b6-a4d4-417f-9556-bdceafdafe3a nodeName:}" failed. No retries permitted until 2026-04-16 16:29:58.19000489 +0000 UTC m=+18.134514873 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2xg8r" (UniqueName: "kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r") pod "network-check-target-9jq9v" (UID: "f81e14b6-a4d4-417f-9556-bdceafdafe3a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:50.599779 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:50.599646 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:29:50.600021 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:50.599771 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:29:50.601241 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:50.601101 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:50.601241 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:50.601207 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
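[Annotation] The recurring sandbox failures above all share one cause: no CNI configuration exists yet in /etc/kubernetes/cni/net.d/, because ovnkube-node (whose volumes were mounted at 16:29:42) has not finished starting and writing it. These errors normally clear on their own once the network plugin is up. As an illustration only, not a fix, a poll for the directory the kubelet names in the error:

import os, time

# Illustration only: wait for the network plugin (here ovnkube-node) to
# drop a CNI config into the directory named in the kubelet errors.
CNI_DIR = "/etc/kubernetes/cni/net.d/"
for _ in range(60):
    entries = os.listdir(CNI_DIR) if os.path.isdir(CNI_DIR) else []
    if entries:
        print("CNI config present:", entries)
        break
    time.sleep(5)
else:
    print("still no CNI config; check the ovnkube-node pod")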
pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:29:51.417449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.417383 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-191.ec2.internal" podStartSLOduration=10.417365248 podStartE2EDuration="10.417365248s" podCreationTimestamp="2026-04-16 16:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:29:45.734276237 +0000 UTC m=+5.678786228" watchObservedRunningTime="2026-04-16 16:29:51.417365248 +0000 UTC m=+11.361875245" Apr 16 16:29:51.418150 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.418125 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nv9gc"] Apr 16 16:29:51.421116 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.421095 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:51.421224 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:51.421175 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:29:51.499063 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.498990 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-dbus\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:51.499063 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.499065 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-kubelet-config\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:51.499253 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.499155 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:51.600193 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.600157 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:51.600343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.600218 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-dbus\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:51.600343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.600260 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-kubelet-config\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:51.600343 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:51.600333 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:29:51.600513 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.600356 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-kubelet-config\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:51.600513 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:51.600404 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret podName:bc1da351-41a0-434d-9b7e-bf1cfdc791f4 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:52.100386804 +0000 UTC m=+12.044896778 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret") pod "global-pull-secret-syncer-nv9gc" (UID: "bc1da351-41a0-434d-9b7e-bf1cfdc791f4") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:29:51.600513 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:51.600498 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-dbus\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:52.104448 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:52.104402 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:52.104625 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:52.104600 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:29:52.104698 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:52.104687 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret podName:bc1da351-41a0-434d-9b7e-bf1cfdc791f4 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:53.104671159 +0000 UTC m=+13.049181147 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret") pod "global-pull-secret-syncer-nv9gc" (UID: "bc1da351-41a0-434d-9b7e-bf1cfdc791f4") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:29:52.599524 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:52.599489 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:29:52.599973 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:52.599492 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns"
Apr 16 16:29:52.599973 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:52.599614 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a"
Apr 16 16:29:52.599973 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:52.599706 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371"
Apr 16 16:29:53.112291 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:53.112251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:29:53.112455 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:53.112399 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:29:53.112498 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:53.112484 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret podName:bc1da351-41a0-434d-9b7e-bf1cfdc791f4 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:55.11246595 +0000 UTC m=+15.056975921 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret") pod "global-pull-secret-syncer-nv9gc" (UID: "bc1da351-41a0-434d-9b7e-bf1cfdc791f4") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:29:53.600397 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:53.600361 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:29:53.600838 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:53.600494 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4"
Apr 16 16:29:54.599808 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:54.599776 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:29:54.599808 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:54.599805 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns"
Apr 16 16:29:54.599994 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:54.599897 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a"
Apr 16 16:29:54.600058 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:54.600037 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371"
Apr 16 16:29:55.125860 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:55.125817 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:29:55.126296 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:55.125997 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:29:55.126296 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:55.126082 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret podName:bc1da351-41a0-434d-9b7e-bf1cfdc791f4 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:59.12605912 +0000 UTC m=+19.070569096 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret") pod "global-pull-secret-syncer-nv9gc" (UID: "bc1da351-41a0-434d-9b7e-bf1cfdc791f4") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:29:55.600079 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:55.600042 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:29:55.600249 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:55.600164 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4"
Apr 16 16:29:56.600053 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:56.600012 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:29:56.600475 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:56.600150 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a"
Apr 16 16:29:56.600475 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:56.600240 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns"
Apr 16 16:29:56.600475 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:56.600356 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371"
Apr 16 16:29:57.600240 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:57.600204 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:29:57.600719 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:57.600333 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4"
pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:29:58.247720 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:58.247679 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg8r\" (UniqueName: \"kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r\") pod \"network-check-target-9jq9v\" (UID: \"f81e14b6-a4d4-417f-9556-bdceafdafe3a\") " pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:29:58.247892 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:58.247752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:58.247892 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:58.247849 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:58.247892 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:58.247875 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:29:58.248043 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:58.247903 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:29:58.248043 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:58.247909 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs podName:3fbad60e-9cf1-43dd-abb0-8d7c1caab371 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:14.24789139 +0000 UTC m=+34.192401378 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs") pod "network-metrics-daemon-ff4ns" (UID: "3fbad60e-9cf1-43dd-abb0-8d7c1caab371") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:58.248043 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:58.247916 2569 projected.go:194] Error preparing data for projected volume kube-api-access-2xg8r for pod openshift-network-diagnostics/network-check-target-9jq9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:58.248043 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:58.247967 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r podName:f81e14b6-a4d4-417f-9556-bdceafdafe3a nodeName:}" failed. No retries permitted until 2026-04-16 16:30:14.247951276 +0000 UTC m=+34.192461252 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2xg8r" (UniqueName: "kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r") pod "network-check-target-9jq9v" (UID: "f81e14b6-a4d4-417f-9556-bdceafdafe3a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:58.600511 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:58.600426 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:29:58.600918 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:58.600552 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:29:58.600918 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:58.600616 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:29:58.600918 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:58.600753 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:29:59.154314 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:59.154275 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:59.154519 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:59.154460 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:29:59.154579 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:59.154539 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret podName:bc1da351-41a0-434d-9b7e-bf1cfdc791f4 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:07.154519159 +0000 UTC m=+27.099029153 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret") pod "global-pull-secret-syncer-nv9gc" (UID: "bc1da351-41a0-434d-9b7e-bf1cfdc791f4") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:29:59.599815 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:29:59.599735 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:29:59.599951 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:29:59.599841 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:30:00.600581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.600400 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:00.601112 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.600478 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:00.601112 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:00.600642 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:30:00.601112 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:00.600747 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:30:00.745958 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.745908 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f86f6" event={"ID":"1666c169-2943-4cf3-8a4a-51fc0345056b","Type":"ContainerStarted","Data":"8b23dbf2f166212d236208d539f012798a7b73bbce3225ed44828a86cc3cc97d"} Apr 16 16:30:00.747414 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.747387 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q5298" event={"ID":"ca349f5f-0121-41f2-99a5-676a6e9d7f2c","Type":"ContainerStarted","Data":"033c5d71f7cf109a9b5359fd9313edd5421aa00e857ad42682c7e53cd0af426a"} Apr 16 16:30:00.748859 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.748833 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" event={"ID":"30ff43b1-927c-4bb7-9b93-70440addb1ed","Type":"ContainerStarted","Data":"653a4b8b0c184674b007c362aaaa8c0d6ef116929bd7256d46ec6d8c01de0e73"} Apr 16 16:30:00.750222 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.750200 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" event={"ID":"91b79665-6a07-4b37-bdfb-a4cd7ab285a2","Type":"ContainerStarted","Data":"dfc65f2ebb1e9a0064487c77e49eb06fac95ba5767c5654e7180cb342e4054a0"} Apr 16 16:30:00.752176 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.752158 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:30:00.752487 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.752466 2569 generic.go:358] "Generic (PLEG): container finished" podID="18ea2b0b-1348-4827-969f-18c4a33a0dc8" containerID="b4a5eaa3c615abb9c7c99ed494b0881c5f546e1a8b3fc59b6e408b9ae27ad248" exitCode=1 Apr 16 16:30:00.752544 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.752511 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" event={"ID":"18ea2b0b-1348-4827-969f-18c4a33a0dc8","Type":"ContainerStarted","Data":"5be049ab378de5beca86b405f1f3a5f87358132f5629bf71acf45d7a8504faaa"} Apr 16 16:30:00.752544 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.752539 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" event={"ID":"18ea2b0b-1348-4827-969f-18c4a33a0dc8","Type":"ContainerStarted","Data":"e6155b67a29381522c27a08cbfa0d9ce42da2f755c0dc3ea7f7750c84ebcf5d9"} Apr 16 16:30:00.752626 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.752552 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" event={"ID":"18ea2b0b-1348-4827-969f-18c4a33a0dc8","Type":"ContainerDied","Data":"b4a5eaa3c615abb9c7c99ed494b0881c5f546e1a8b3fc59b6e408b9ae27ad248"} Apr 16 16:30:00.752626 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.752562 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" event={"ID":"18ea2b0b-1348-4827-969f-18c4a33a0dc8","Type":"ContainerStarted","Data":"3a8a21c11b0f1db0d5e7b2c93cde94c33f8c73e05c4837c6f0aac432657520f6"} Apr 16 16:30:00.753783 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.753756 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r42bt" 
event={"ID":"4240101a-1b9a-426e-bf0c-bc8b7b372154","Type":"ContainerStarted","Data":"71facebe1bf8f17bbc631090b8b8cdb3d7cbdf32702326f8ea0709c6a966a1b5"} Apr 16 16:30:00.755314 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.755277 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zcdxw" event={"ID":"d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c","Type":"ContainerStarted","Data":"675d521aa988131b60c9266d9bd51341471af31b96bf8fe25a15800aa74deb05"} Apr 16 16:30:00.756594 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.756563 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-smsng" event={"ID":"d458bdca-23bd-4bb6-b0ec-a3050b306786","Type":"ContainerStarted","Data":"dec01514be3e41c3754a493cb4f05fe58d7d77fa7d9f9f74ea539d5df3677a22"} Apr 16 16:30:00.782976 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.782914 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zcdxw" podStartSLOduration=3.70775501 podStartE2EDuration="20.782895181s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:29:43.022038223 +0000 UTC m=+2.966548200" lastFinishedPulling="2026-04-16 16:30:00.097178387 +0000 UTC m=+20.041688371" observedRunningTime="2026-04-16 16:30:00.782635723 +0000 UTC m=+20.727145720" watchObservedRunningTime="2026-04-16 16:30:00.782895181 +0000 UTC m=+20.727405175" Apr 16 16:30:00.800665 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.799309 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-smsng" podStartSLOduration=11.893156283 podStartE2EDuration="20.799288802s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:29:43.019293835 +0000 UTC m=+2.963803810" lastFinishedPulling="2026-04-16 16:29:51.925426342 +0000 UTC m=+11.869936329" observedRunningTime="2026-04-16 16:30:00.798158122 +0000 UTC m=+20.742668117" watchObservedRunningTime="2026-04-16 16:30:00.799288802 +0000 UTC m=+20.743798840" Apr 16 16:30:00.813984 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.813923 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-r42bt" podStartSLOduration=3.731700891 podStartE2EDuration="20.813901534s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:29:43.01303444 +0000 UTC m=+2.957544416" lastFinishedPulling="2026-04-16 16:30:00.095235072 +0000 UTC m=+20.039745059" observedRunningTime="2026-04-16 16:30:00.813254468 +0000 UTC m=+20.757764461" watchObservedRunningTime="2026-04-16 16:30:00.813901534 +0000 UTC m=+20.758411542" Apr 16 16:30:00.854276 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.854221 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8nd6p" podStartSLOduration=3.773544452 podStartE2EDuration="20.854200639s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:29:43.016633778 +0000 UTC m=+2.961143749" lastFinishedPulling="2026-04-16 16:30:00.097289959 +0000 UTC m=+20.041799936" observedRunningTime="2026-04-16 16:30:00.853776506 +0000 UTC m=+20.798286512" watchObservedRunningTime="2026-04-16 16:30:00.854200639 +0000 UTC m=+20.798710631" Apr 16 16:30:00.880957 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:00.880899 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-q5298" podStartSLOduration=3.79293233 podStartE2EDuration="20.880885431s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:29:43.022131592 +0000 UTC m=+2.966641577" lastFinishedPulling="2026-04-16 16:30:00.110084703 +0000 UTC m=+20.054594678" observedRunningTime="2026-04-16 16:30:00.880502854 +0000 UTC m=+20.825012849" watchObservedRunningTime="2026-04-16 16:30:00.880885431 +0000 UTC m=+20.825395424" Apr 16 16:30:01.235215 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.235141 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-smsng" Apr 16 16:30:01.235629 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.235611 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-smsng" Apr 16 16:30:01.599637 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.599607 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:30:01.599838 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:01.599746 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:30:01.714698 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.714660 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:30:01.759124 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.759092 2569 generic.go:358] "Generic (PLEG): container finished" podID="1666c169-2943-4cf3-8a4a-51fc0345056b" containerID="8b23dbf2f166212d236208d539f012798a7b73bbce3225ed44828a86cc3cc97d" exitCode=0 Apr 16 16:30:01.759304 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.759163 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f86f6" event={"ID":"1666c169-2943-4cf3-8a4a-51fc0345056b","Type":"ContainerDied","Data":"8b23dbf2f166212d236208d539f012798a7b73bbce3225ed44828a86cc3cc97d"} Apr 16 16:30:01.760755 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.760734 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" event={"ID":"30ff43b1-927c-4bb7-9b93-70440addb1ed","Type":"ContainerStarted","Data":"63a2cccc7e25d6976a5784907d394774f892f39dbe2ecae2114822fb7fb91afa"} Apr 16 16:30:01.761988 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.761936 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7xm7g" event={"ID":"c700cbcd-8214-4f4c-b770-1c0db784bc7b","Type":"ContainerStarted","Data":"fecc4b11fbfdf5fda899650c7cb24adb2d2690228eb326745bc494d63899eaa0"} Apr 16 16:30:01.764237 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.764220 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:30:01.764572 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.764550 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" event={"ID":"18ea2b0b-1348-4827-969f-18c4a33a0dc8","Type":"ContainerStarted","Data":"a3a3ca9826e222ab5366e216c9b2cf37076e6915e1bc665901f4cb378bbf4af5"} Apr 16 16:30:01.764634 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.764578 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" event={"ID":"18ea2b0b-1348-4827-969f-18c4a33a0dc8","Type":"ContainerStarted","Data":"8271cad6089ab50f4a25540c4adc629abe8605fcffeb559dae5cd89c621d2fb8"} Apr 16 16:30:01.765290 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.765263 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-smsng" Apr 16 16:30:01.765622 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.765603 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-smsng" Apr 16 16:30:01.791632 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:01.791554 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7xm7g" podStartSLOduration=4.742295856 podStartE2EDuration="21.791541196s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:29:43.017462708 +0000 UTC m=+2.961972686" lastFinishedPulling="2026-04-16 16:30:00.066708037 +0000 UTC m=+20.011218026" observedRunningTime="2026-04-16 16:30:01.791209998 +0000 UTC m=+21.735719992" watchObservedRunningTime="2026-04-16 16:30:01.791541196 +0000 UTC m=+21.736051188" Apr 16 16:30:02.585252 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:02.585152 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:30:01.71468193Z","UUID":"a0390123-9ab0-46cf-9aac-238b75f7d5cb","Handler":null,"Name":"","Endpoint":""} Apr 16 16:30:02.587139 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:02.587103 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:30:02.587139 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:02.587134 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:30:02.599526 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:02.599476 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:02.599746 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:02.599511 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:02.599746 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:02.599614 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:30:02.599746 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:02.599711 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:30:03.599728 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:03.599507 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:30:03.600149 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:03.599779 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:30:03.771538 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:03.771511 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:30:03.771888 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:03.771847 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" event={"ID":"18ea2b0b-1348-4827-969f-18c4a33a0dc8","Type":"ContainerStarted","Data":"96717c17655428944564c4a4d03387b4e1d2247be7c344175926a634eb7d4cef"} Apr 16 16:30:03.773631 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:03.773607 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" event={"ID":"30ff43b1-927c-4bb7-9b93-70440addb1ed","Type":"ContainerStarted","Data":"cc5f39ff55ffa2d529165261eb67588d62364291aa87e73d0c4d14bdf2b43609"} Apr 16 16:30:03.789710 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:03.789663 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zgpkd" podStartSLOduration=3.684318479 podStartE2EDuration="23.789650408s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:29:43.018166851 +0000 UTC m=+2.962676823" lastFinishedPulling="2026-04-16 16:30:03.123498767 +0000 UTC m=+23.068008752" observedRunningTime="2026-04-16 16:30:03.789539616 +0000 UTC m=+23.734049608" watchObservedRunningTime="2026-04-16 16:30:03.789650408 +0000 UTC m=+23.734160401" Apr 16 16:30:04.600559 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:04.600524 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:04.601091 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:04.600577 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:04.601091 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:04.600674 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:30:04.601091 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:04.600813 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:30:05.600080 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:05.600047 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:30:05.600239 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:05.600181 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:30:06.600401 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:06.600227 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:06.600980 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:06.600227 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:06.600980 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:06.600500 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:30:06.600980 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:06.600542 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:30:06.782634 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:06.782603 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:30:06.783397 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:06.783370 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" event={"ID":"18ea2b0b-1348-4827-969f-18c4a33a0dc8","Type":"ContainerStarted","Data":"cb9a8a4a9124f0eb386954dd02a9aafeb1bad33a4814031abfaa09f94de438f1"} Apr 16 16:30:06.783714 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:06.783690 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:30:06.783793 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:06.783723 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:30:06.783879 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:06.783866 2569 scope.go:117] "RemoveContainer" containerID="b4a5eaa3c615abb9c7c99ed494b0881c5f546e1a8b3fc59b6e408b9ae27ad248" Apr 16 16:30:06.789542 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:06.789505 2569 generic.go:358] "Generic (PLEG): container finished" podID="1666c169-2943-4cf3-8a4a-51fc0345056b" containerID="ccee17c9f6854b860aafe31da6290100af69a7d41961aef21323685f15d565d4" exitCode=0 Apr 16 16:30:06.789662 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:06.789556 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f86f6" event={"ID":"1666c169-2943-4cf3-8a4a-51fc0345056b","Type":"ContainerDied","Data":"ccee17c9f6854b860aafe31da6290100af69a7d41961aef21323685f15d565d4"} Apr 16 16:30:06.801793 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:06.801773 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:30:07.217809 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:07.217778 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:30:07.217956 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:07.217910 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:30:07.217992 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:07.217971 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret podName:bc1da351-41a0-434d-9b7e-bf1cfdc791f4 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:23.217954121 +0000 UTC m=+43.162464092 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret") pod "global-pull-secret-syncer-nv9gc" (UID: "bc1da351-41a0-434d-9b7e-bf1cfdc791f4") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:30:07.599870 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:07.599803 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:30:07.600031 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:07.599908 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:30:07.793010 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:07.792977 2569 generic.go:358] "Generic (PLEG): container finished" podID="1666c169-2943-4cf3-8a4a-51fc0345056b" containerID="4e541f01f8660a56dbddda399af2347d48794e5eb69c5338188ba8af2b6305d7" exitCode=0 Apr 16 16:30:07.793444 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:07.793054 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f86f6" event={"ID":"1666c169-2943-4cf3-8a4a-51fc0345056b","Type":"ContainerDied","Data":"4e541f01f8660a56dbddda399af2347d48794e5eb69c5338188ba8af2b6305d7"} Apr 16 16:30:07.796376 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:07.796355 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:30:07.796688 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:07.796666 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" event={"ID":"18ea2b0b-1348-4827-969f-18c4a33a0dc8","Type":"ContainerStarted","Data":"dfa5c471bb0e3efb4a42188b70e02a05055c0e25be0b9e5f5a0f6b1b8d26baf8"} Apr 16 16:30:07.796876 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:07.796864 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:30:07.810660 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:07.810636 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:30:07.837001 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:07.836964 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" podStartSLOduration=10.700146329 podStartE2EDuration="27.836950689s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:29:43.013922906 +0000 UTC m=+2.958432877" lastFinishedPulling="2026-04-16 16:30:00.150727259 +0000 UTC m=+20.095237237" observedRunningTime="2026-04-16 16:30:07.836581786 +0000 UTC m=+27.781091779" watchObservedRunningTime="2026-04-16 16:30:07.836950689 +0000 UTC m=+27.781460681" Apr 16 16:30:08.599513 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:08.599481 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:08.599513 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:08.599481 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:08.599669 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:08.599630 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:30:08.599768 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:08.599745 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:30:08.800411 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:08.800324 2569 generic.go:358] "Generic (PLEG): container finished" podID="1666c169-2943-4cf3-8a4a-51fc0345056b" containerID="cdbfca2bbca13feeeced51f37adb3cf6069177af92e2af52b9e0a255cdc6bad1" exitCode=0 Apr 16 16:30:08.800411 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:08.800397 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f86f6" event={"ID":"1666c169-2943-4cf3-8a4a-51fc0345056b","Type":"ContainerDied","Data":"cdbfca2bbca13feeeced51f37adb3cf6069177af92e2af52b9e0a255cdc6bad1"} Apr 16 16:30:09.600032 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:09.600004 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:30:09.600226 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:09.600118 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:30:10.601514 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:10.601468 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:10.601934 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:10.601576 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:30:10.601934 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:10.601476 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:10.603931 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:10.602238 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:30:11.599800 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:11.599766 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:30:11.599999 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:11.599867 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:30:12.599527 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:12.599496 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:12.600013 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:12.599499 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:12.600013 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:12.599630 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:30:12.600013 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:12.599734 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:30:13.600127 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:13.600093 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:30:13.600658 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:13.600206 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:30:14.268645 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:14.268611 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg8r\" (UniqueName: \"kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r\") pod \"network-check-target-9jq9v\" (UID: \"f81e14b6-a4d4-417f-9556-bdceafdafe3a\") " pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:14.268810 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:14.268658 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:14.268810 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:14.268751 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:30:14.268810 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:14.268788 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:30:14.268810 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:14.268801 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs podName:3fbad60e-9cf1-43dd-abb0-8d7c1caab371 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:46.268787187 +0000 UTC m=+66.213297157 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs") pod "network-metrics-daemon-ff4ns" (UID: "3fbad60e-9cf1-43dd-abb0-8d7c1caab371") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:30:14.268810 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:14.268807 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:30:14.268972 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:14.268819 2569 projected.go:194] Error preparing data for projected volume kube-api-access-2xg8r for pod openshift-network-diagnostics/network-check-target-9jq9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:30:14.268972 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:14.268874 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r podName:f81e14b6-a4d4-417f-9556-bdceafdafe3a nodeName:}" failed. No retries permitted until 2026-04-16 16:30:46.26886181 +0000 UTC m=+66.213371781 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2xg8r" (UniqueName: "kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r") pod "network-check-target-9jq9v" (UID: "f81e14b6-a4d4-417f-9556-bdceafdafe3a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:30:14.600111 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:14.600035 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:14.600268 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:14.600153 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a" Apr 16 16:30:14.600268 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:14.600230 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:14.600663 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:14.600347 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371" Apr 16 16:30:15.599680 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:15.599649 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:30:15.599836 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:15.599746 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4" Apr 16 16:30:15.816305 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:15.816268 2569 generic.go:358] "Generic (PLEG): container finished" podID="1666c169-2943-4cf3-8a4a-51fc0345056b" containerID="ccf46d80109a8de16d909e676e06a48bf59b08be466d2c8c2c56832131b9d289" exitCode=0 Apr 16 16:30:15.816721 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:15.816308 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f86f6" event={"ID":"1666c169-2943-4cf3-8a4a-51fc0345056b","Type":"ContainerDied","Data":"ccf46d80109a8de16d909e676e06a48bf59b08be466d2c8c2c56832131b9d289"} Apr 16 16:30:16.600357 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:16.600323 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:16.600571 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:16.600324 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:30:16.600571 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:16.600473 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371"
Apr 16 16:30:16.600571 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:16.600519 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a"
Apr 16 16:30:16.820753 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:16.820720 2569 generic.go:358] "Generic (PLEG): container finished" podID="1666c169-2943-4cf3-8a4a-51fc0345056b" containerID="57e8863b58251856b2bfec1878558360dc340d3ac18ef7c3c8b8340b15dcc98a" exitCode=0
Apr 16 16:30:16.821167 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:16.820789 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f86f6" event={"ID":"1666c169-2943-4cf3-8a4a-51fc0345056b","Type":"ContainerDied","Data":"57e8863b58251856b2bfec1878558360dc340d3ac18ef7c3c8b8340b15dcc98a"}
Apr 16 16:30:17.600506 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:17.600472 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:30:17.600677 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:17.600571 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4"
Apr 16 16:30:17.824929 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:17.824891 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f86f6" event={"ID":"1666c169-2943-4cf3-8a4a-51fc0345056b","Type":"ContainerStarted","Data":"d5ccecc0dd4913d930edc9f280b54e01b795617e9df7fc0f3fa7e3a182c2517b"}
Apr 16 16:30:17.856073 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:17.855999 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-f86f6" podStartSLOduration=6.178097294 podStartE2EDuration="37.855986102s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:29:43.020604176 +0000 UTC m=+2.965114156" lastFinishedPulling="2026-04-16 16:30:14.698492989 +0000 UTC m=+34.643002964" observedRunningTime="2026-04-16 16:30:17.855845184 +0000 UTC m=+37.800355177" watchObservedRunningTime="2026-04-16 16:30:17.855986102 +0000 UTC m=+37.800496095"
Apr 16 16:30:18.600213 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:18.600181 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:30:18.600390 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:18.600224 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns"
Apr 16 16:30:18.600390 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:18.600311 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a"
Apr 16 16:30:18.600537 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:18.600453 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371"
Apr 16 16:30:18.871095 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:18.870886 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ff4ns"]
Apr 16 16:30:18.871655 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:18.871121 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns"
Apr 16 16:30:18.871655 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:18.871234 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371"
Apr 16 16:30:18.873651 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:18.873617 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9jq9v"]
Apr 16 16:30:18.873763 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:18.873710 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:30:18.873835 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:18.873807 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a"
Apr 16 16:30:18.874170 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:18.874149 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nv9gc"]
Apr 16 16:30:18.874260 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:18.874237 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:30:18.874341 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:18.874324 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4"
Apr 16 16:30:20.600811 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:20.600783 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v"
Apr 16 16:30:20.601411 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:20.600874 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jq9v" podUID="f81e14b6-a4d4-417f-9556-bdceafdafe3a"
Apr 16 16:30:20.601411 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:20.600949 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns"
Apr 16 16:30:20.601411 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:20.601063 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:30:20.601411 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:20.601072 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4ns" podUID="3fbad60e-9cf1-43dd-abb0-8d7c1caab371"
Apr 16 16:30:20.601411 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:20.601137 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv9gc" podUID="bc1da351-41a0-434d-9b7e-bf1cfdc791f4"
Apr 16 16:30:22.342912 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.342833 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-191.ec2.internal" event="NodeReady"
Apr 16 16:30:22.343414 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.342940 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 16:30:22.374390 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.374359 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-767dc6c99d-gsj9l"]
Apr 16 16:30:22.377849 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.377829 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l"
Apr 16 16:30:22.380367 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.380260 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 16:30:22.380367 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.380280 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fdzgl\""
Apr 16 16:30:22.380367 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.380313 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 16:30:22.380367 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.380353 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 16:30:22.385157 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.384812 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 16:30:22.389622 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.389553 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-767dc6c99d-gsj9l"]
Apr 16 16:30:22.390280 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.390258 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bw5bz"]
Apr 16 16:30:22.393371 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.393355 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b447l"]
Apr 16 16:30:22.393609 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.393592 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bw5bz"
Apr 16 16:30:22.395706 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.395687 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 16:30:22.395789 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.395735 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8mbbv\""
Apr 16 16:30:22.395849 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.395785 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 16:30:22.396558 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.396540 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b447l"
Apr 16 16:30:22.398545 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.398527 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 16:30:22.398656 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.398529 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mk2ls\""
Apr 16 16:30:22.398656 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.398584 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 16:30:22.398770 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.398697 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 16:30:22.401340 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.401323 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bw5bz"]
Apr 16 16:30:22.413187 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.413167 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b447l"]
Apr 16 16:30:22.429427 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429400 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw4ql\" (UniqueName: \"kubernetes.io/projected/cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958-kube-api-access-rw4ql\") pod \"ingress-canary-b447l\" (UID: \"cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958\") " pod="openshift-ingress-canary/ingress-canary-b447l"
Apr 16 16:30:22.429574 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429461 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-registry-tls\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l"
Apr 16 16:30:22.429574 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429479 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-bound-sa-token\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l"
Apr 16 16:30:22.429574 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-registry-certificates\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l"
Apr 16 16:30:22.429574 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429511 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9c9b578-b92d-41d0-8f31-a11cc6862b71-config-volume\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz"
Apr 16 16:30:22.429574 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429551 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-image-registry-private-configuration\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l"
Apr 16 16:30:22.429785 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429587 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958-cert\") pod \"ingress-canary-b447l\" (UID: \"cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958\") " pod="openshift-ingress-canary/ingress-canary-b447l"
Apr 16 16:30:22.429785 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429601 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c9b578-b92d-41d0-8f31-a11cc6862b71-metrics-tls\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz"
Apr 16 16:30:22.429785 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429617 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-trusted-ca\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l"
Apr 16 16:30:22.429785 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9c9b578-b92d-41d0-8f31-a11cc6862b71-tmp-dir\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz"
Apr 16 16:30:22.429785 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429648 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-installation-pull-secrets\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l"
Apr 16 16:30:22.429785 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429663 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdhxq\" (UniqueName: \"kubernetes.io/projected/d9c9b578-b92d-41d0-8f31-a11cc6862b71-kube-api-access-kdhxq\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz"
Apr 16 16:30:22.429785 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.429679 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xpvm\" (UniqueName: \"kubernetes.io/projected/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-kube-api-access-6xpvm\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l"
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-ca-trust-extracted\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.530219 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.530185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-registry-tls\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.530219 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.530221 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-bound-sa-token\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.530484 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.530257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-registry-certificates\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.530484 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.530373 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9c9b578-b92d-41d0-8f31-a11cc6862b71-config-volume\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz" Apr 16 16:30:22.530612 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.530531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-image-registry-private-configuration\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.530612 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.530569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958-cert\") pod \"ingress-canary-b447l\" (UID: \"cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958\") " pod="openshift-ingress-canary/ingress-canary-b447l" Apr 16 16:30:22.530612 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.530586 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c9b578-b92d-41d0-8f31-a11cc6862b71-metrics-tls\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz" Apr 16 16:30:22.530612 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.530606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-trusted-ca\") pod 
\"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.530802 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.530630 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9c9b578-b92d-41d0-8f31-a11cc6862b71-tmp-dir\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz" Apr 16 16:30:22.531029 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.531004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9c9b578-b92d-41d0-8f31-a11cc6862b71-config-volume\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz" Apr 16 16:30:22.531189 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.531040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-installation-pull-secrets\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.531296 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.531281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdhxq\" (UniqueName: \"kubernetes.io/projected/d9c9b578-b92d-41d0-8f31-a11cc6862b71-kube-api-access-kdhxq\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz" Apr 16 16:30:22.531381 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.531368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xpvm\" (UniqueName: \"kubernetes.io/projected/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-kube-api-access-6xpvm\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.531556 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.531533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-trusted-ca\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.531691 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.531542 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-ca-trust-extracted\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.531825 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.531804 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw4ql\" (UniqueName: \"kubernetes.io/projected/cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958-kube-api-access-rw4ql\") pod \"ingress-canary-b447l\" (UID: \"cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958\") " pod="openshift-ingress-canary/ingress-canary-b447l" Apr 16 16:30:22.531908 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.531834 
2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-ca-trust-extracted\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.531908 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.531469 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9c9b578-b92d-41d0-8f31-a11cc6862b71-tmp-dir\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz" Apr 16 16:30:22.531999 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.531988 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-registry-certificates\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.534632 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.534612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c9b578-b92d-41d0-8f31-a11cc6862b71-metrics-tls\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz" Apr 16 16:30:22.534733 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.534653 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-installation-pull-secrets\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.534733 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.534661 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-image-registry-private-configuration\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.534733 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.534701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958-cert\") pod \"ingress-canary-b447l\" (UID: \"cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958\") " pod="openshift-ingress-canary/ingress-canary-b447l" Apr 16 16:30:22.534733 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.534701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-registry-tls\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.537189 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.537168 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-bound-sa-token\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: 
\"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.538760 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.538740 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdhxq\" (UniqueName: \"kubernetes.io/projected/d9c9b578-b92d-41d0-8f31-a11cc6862b71-kube-api-access-kdhxq\") pod \"dns-default-bw5bz\" (UID: \"d9c9b578-b92d-41d0-8f31-a11cc6862b71\") " pod="openshift-dns/dns-default-bw5bz" Apr 16 16:30:22.538858 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.538843 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xpvm\" (UniqueName: \"kubernetes.io/projected/eead9e2d-3f5a-4b10-abe0-1af5f3458da5-kube-api-access-6xpvm\") pod \"image-registry-767dc6c99d-gsj9l\" (UID: \"eead9e2d-3f5a-4b10-abe0-1af5f3458da5\") " pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:22.539130 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.539114 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw4ql\" (UniqueName: \"kubernetes.io/projected/cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958-kube-api-access-rw4ql\") pod \"ingress-canary-b447l\" (UID: \"cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958\") " pod="openshift-ingress-canary/ingress-canary-b447l" Apr 16 16:30:22.600405 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.600333 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:22.600539 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.600333 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc" Apr 16 16:30:22.600539 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.600333 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:22.602841 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.602808 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:30:22.602841 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.602836 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:30:22.602983 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.602869 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ws7rw\"" Apr 16 16:30:22.602983 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.602880 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:30:22.603175 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.603157 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:30:22.603261 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.603232 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqp8k\"" Apr 16 16:30:22.688413 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.688388 2569 util.go:30] "No sandbox for pod can be found. 
Apr 16 16:30:22.688413 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.688388 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l"
Apr 16 16:30:22.704154 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.704133 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bw5bz"
Apr 16 16:30:22.708712 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.708692 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b447l"
Apr 16 16:30:22.838042 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.837666 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b447l"]
Apr 16 16:30:22.838276 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.838253 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-767dc6c99d-gsj9l"]
Apr 16 16:30:22.839862 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:22.839838 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bw5bz"]
Apr 16 16:30:22.842389 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:22.842365 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb9c7b6f_6df0_4fa9_bdc9_90df14ca5958.slice/crio-c1ed3108a182f1dab1967ca7fe45bc860f0a0fbe9b1d6cffb20ef15755736900 WatchSource:0}: Error finding container c1ed3108a182f1dab1967ca7fe45bc860f0a0fbe9b1d6cffb20ef15755736900: Status 404 returned error can't find the container with id c1ed3108a182f1dab1967ca7fe45bc860f0a0fbe9b1d6cffb20ef15755736900
Apr 16 16:30:22.842825 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:22.842802 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeead9e2d_3f5a_4b10_abe0_1af5f3458da5.slice/crio-e1031ee16bf7887440c0054dc0dd08ee763a332b66f252ba308261c0f5ecb6ea WatchSource:0}: Error finding container e1031ee16bf7887440c0054dc0dd08ee763a332b66f252ba308261c0f5ecb6ea: Status 404 returned error can't find the container with id e1031ee16bf7887440c0054dc0dd08ee763a332b66f252ba308261c0f5ecb6ea
Apr 16 16:30:22.843718 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:22.843702 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c9b578_b92d_41d0_8f31_a11cc6862b71.slice/crio-ed77167712ec870f375018b424104878dfeaafd178160f9646ba43317efa4804 WatchSource:0}: Error finding container ed77167712ec870f375018b424104878dfeaafd178160f9646ba43317efa4804: Status 404 returned error can't find the container with id ed77167712ec870f375018b424104878dfeaafd178160f9646ba43317efa4804
Apr 16 16:30:23.237862 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.237815 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:30:23.241895 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.241870 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1da351-41a0-434d-9b7e-bf1cfdc791f4-original-pull-secret\") pod \"global-pull-secret-syncer-nv9gc\" (UID: \"bc1da351-41a0-434d-9b7e-bf1cfdc791f4\") " pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:30:23.515248 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.515164 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv9gc"
Apr 16 16:30:23.658636 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.658599 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nv9gc"]
Apr 16 16:30:23.669910 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:23.669875 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1da351_41a0_434d_9b7e_bf1cfdc791f4.slice/crio-af1267a34ec05902877d4d3baddaf120ca96eb009c2f896055d7c6d41fbdbf65 WatchSource:0}: Error finding container af1267a34ec05902877d4d3baddaf120ca96eb009c2f896055d7c6d41fbdbf65: Status 404 returned error can't find the container with id af1267a34ec05902877d4d3baddaf120ca96eb009c2f896055d7c6d41fbdbf65
Apr 16 16:30:23.837531 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.837442 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nv9gc" event={"ID":"bc1da351-41a0-434d-9b7e-bf1cfdc791f4","Type":"ContainerStarted","Data":"af1267a34ec05902877d4d3baddaf120ca96eb009c2f896055d7c6d41fbdbf65"}
Apr 16 16:30:23.839129 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.839093 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" event={"ID":"eead9e2d-3f5a-4b10-abe0-1af5f3458da5","Type":"ContainerStarted","Data":"e5e0453cfe55c9d09236fcfe7e962be834f76aaa17002a4c3b3ce2d85553fdd6"}
Apr 16 16:30:23.839129 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.839130 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" event={"ID":"eead9e2d-3f5a-4b10-abe0-1af5f3458da5","Type":"ContainerStarted","Data":"e1031ee16bf7887440c0054dc0dd08ee763a332b66f252ba308261c0f5ecb6ea"}
Apr 16 16:30:23.839366 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.839276 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l"
Apr 16 16:30:23.840308 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.840239 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b447l" event={"ID":"cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958","Type":"ContainerStarted","Data":"c1ed3108a182f1dab1967ca7fe45bc860f0a0fbe9b1d6cffb20ef15755736900"}
Apr 16 16:30:23.841476 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.841454 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bw5bz" event={"ID":"d9c9b578-b92d-41d0-8f31-a11cc6862b71","Type":"ContainerStarted","Data":"ed77167712ec870f375018b424104878dfeaafd178160f9646ba43317efa4804"}
Apr 16 16:30:23.860328 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:23.860277 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" podStartSLOduration=6.860259592 podStartE2EDuration="6.860259592s" podCreationTimestamp="2026-04-16 16:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:30:23.859034212 +0000 UTC m=+43.803544240" watchObservedRunningTime="2026-04-16 16:30:23.860259592 +0000 UTC m=+43.804769586"
Apr 16 16:30:24.846234 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:24.845523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b447l" event={"ID":"cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958","Type":"ContainerStarted","Data":"ed744117cb0fa26cc68a3527c504dd11801568e8f277ca3909b660935683c0f6"}
Apr 16 16:30:24.848862 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:24.848825 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bw5bz" event={"ID":"d9c9b578-b92d-41d0-8f31-a11cc6862b71","Type":"ContainerStarted","Data":"1225d8853134854b1a1fefc834f11b2cf0fe17c7ba49a785ee16f16d04e6d9fd"}
Apr 16 16:30:24.863264 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:24.863214 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b447l" podStartSLOduration=1.074481514 podStartE2EDuration="2.863197105s" podCreationTimestamp="2026-04-16 16:30:22 +0000 UTC" firstStartedPulling="2026-04-16 16:30:22.84450985 +0000 UTC m=+42.789019824" lastFinishedPulling="2026-04-16 16:30:24.633225442 +0000 UTC m=+44.577735415" observedRunningTime="2026-04-16 16:30:24.86311957 +0000 UTC m=+44.807629564" watchObservedRunningTime="2026-04-16 16:30:24.863197105 +0000 UTC m=+44.807707097"
Apr 16 16:30:25.852982 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:25.852945 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bw5bz" event={"ID":"d9c9b578-b92d-41d0-8f31-a11cc6862b71","Type":"ContainerStarted","Data":"539ccfcec1367d4df9fc502fd95c2440f0da8ecc8ecf024c9bbab1a79669cc0a"}
Apr 16 16:30:25.867898 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:25.867844 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bw5bz" podStartSLOduration=2.083140967 podStartE2EDuration="3.86782852s" podCreationTimestamp="2026-04-16 16:30:22 +0000 UTC" firstStartedPulling="2026-04-16 16:30:22.845510733 +0000 UTC m=+42.790020704" lastFinishedPulling="2026-04-16 16:30:24.630198286 +0000 UTC m=+44.574708257" observedRunningTime="2026-04-16 16:30:25.867071159 +0000 UTC m=+45.811581151" watchObservedRunningTime="2026-04-16 16:30:25.86782852 +0000 UTC m=+45.812338515"
Apr 16 16:30:25.972874 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:25.972812 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bw5bz_d9c9b578-b92d-41d0-8f31-a11cc6862b71/kube-rbac-proxy/0.log"
Apr 16 16:30:26.855649 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:26.855613 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bw5bz"
Apr 16 16:30:27.371288 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:27.371258 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zcdxw_d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c/dns-node-resolver/0.log"
Apr 16 16:30:27.772731 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:27.772694 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-767dc6c99d-gsj9l_eead9e2d-3f5a-4b10-abe0-1af5f3458da5/registry/0.log"
Apr 16 16:30:27.950647 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:27.950608 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jlfdr"]
pods=["openshift-insights/insights-runtime-extractor-jlfdr"] Apr 16 16:30:27.968912 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:27.968889 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:27.971586 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:27.971505 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:30:27.971586 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:27.971553 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:30:27.971586 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:27.971574 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:30:27.971931 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:27.971555 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-tj77f\"" Apr 16 16:30:27.971931 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:27.971555 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:30:28.076534 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.076458 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-data-volume\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.076534 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.076505 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.076758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.076567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prt7p\" (UniqueName: \"kubernetes.io/projected/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-kube-api-access-prt7p\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.076758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.076622 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.076758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.076693 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-crio-socket\") pod \"insights-runtime-extractor-jlfdr\" (UID: 
\"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.177022 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.176986 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.177197 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.177059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-crio-socket\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.177197 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.177101 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-data-volume\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.177197 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.177138 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.177197 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.177161 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prt7p\" (UniqueName: \"kubernetes.io/projected/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-kube-api-access-prt7p\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.177382 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.177263 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-crio-socket\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.177514 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.177484 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-data-volume\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.177811 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.177794 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.180368 ip-10-0-132-191 
kubenswrapper[2569]: I0416 16:30:28.180343 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.185383 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.185363 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prt7p\" (UniqueName: \"kubernetes.io/projected/e53c6569-aafb-4b9a-8cd8-4e2f9772d993-kube-api-access-prt7p\") pod \"insights-runtime-extractor-jlfdr\" (UID: \"e53c6569-aafb-4b9a-8cd8-4e2f9772d993\") " pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.278419 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.278385 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jlfdr" Apr 16 16:30:28.372591 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.372564 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-r42bt_4240101a-1b9a-426e-bf0c-bc8b7b372154/node-ca/0.log" Apr 16 16:30:28.425176 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.425151 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jlfdr"] Apr 16 16:30:28.428279 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:28.428240 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode53c6569_aafb_4b9a_8cd8_4e2f9772d993.slice/crio-e1eb67a7fa07b920aab5c20f9f1b272931f3c5177d16fe907feecfc728b2ea0a WatchSource:0}: Error finding container e1eb67a7fa07b920aab5c20f9f1b272931f3c5177d16fe907feecfc728b2ea0a: Status 404 returned error can't find the container with id e1eb67a7fa07b920aab5c20f9f1b272931f3c5177d16fe907feecfc728b2ea0a Apr 16 16:30:28.772192 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.772159 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b447l_cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958/serve-healthcheck-canary/0.log" Apr 16 16:30:28.861790 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.861753 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jlfdr" event={"ID":"e53c6569-aafb-4b9a-8cd8-4e2f9772d993","Type":"ContainerStarted","Data":"307f3634621c7a351eaf38fc27592a6a22497303b5911d3cfe56daf6c97a358a"} Apr 16 16:30:28.861790 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.861796 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jlfdr" event={"ID":"e53c6569-aafb-4b9a-8cd8-4e2f9772d993","Type":"ContainerStarted","Data":"e1eb67a7fa07b920aab5c20f9f1b272931f3c5177d16fe907feecfc728b2ea0a"} Apr 16 16:30:28.863010 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.862985 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nv9gc" event={"ID":"bc1da351-41a0-434d-9b7e-bf1cfdc791f4","Type":"ContainerStarted","Data":"353f9dbb3d1e6317bdd96225dd90700fe7bca4024bcce1fc22ca29b72476f859"} Apr 16 16:30:28.877317 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:28.877277 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nv9gc" 
podStartSLOduration=33.327057517 podStartE2EDuration="37.877263381s" podCreationTimestamp="2026-04-16 16:29:51 +0000 UTC" firstStartedPulling="2026-04-16 16:30:23.695222096 +0000 UTC m=+43.639732073" lastFinishedPulling="2026-04-16 16:30:28.245427955 +0000 UTC m=+48.189937937" observedRunningTime="2026-04-16 16:30:28.876157438 +0000 UTC m=+48.820667431" watchObservedRunningTime="2026-04-16 16:30:28.877263381 +0000 UTC m=+48.821773374" Apr 16 16:30:29.869804 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:29.869770 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jlfdr" event={"ID":"e53c6569-aafb-4b9a-8cd8-4e2f9772d993","Type":"ContainerStarted","Data":"878f26203d9294b56d903eac8d4778a6212c52b01d29c161ef38046b26811e1a"} Apr 16 16:30:31.878123 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:31.877884 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jlfdr" event={"ID":"e53c6569-aafb-4b9a-8cd8-4e2f9772d993","Type":"ContainerStarted","Data":"d0edaed210240162ea80f1df1043e92c21649d889d7aca0b7ee7ca02bf719613"} Apr 16 16:30:31.892809 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:31.892761 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jlfdr" podStartSLOduration=1.739712256 podStartE2EDuration="4.892745524s" podCreationTimestamp="2026-04-16 16:30:27 +0000 UTC" firstStartedPulling="2026-04-16 16:30:28.520307729 +0000 UTC m=+48.464817703" lastFinishedPulling="2026-04-16 16:30:31.673340976 +0000 UTC m=+51.617850971" observedRunningTime="2026-04-16 16:30:31.892205104 +0000 UTC m=+51.836715096" watchObservedRunningTime="2026-04-16 16:30:31.892745524 +0000 UTC m=+51.837255517" Apr 16 16:30:35.680366 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.680331 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6hznm"] Apr 16 16:30:35.694685 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.694659 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9kznk"] Apr 16 16:30:35.694846 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.694821 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" Apr 16 16:30:35.697150 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.697127 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:30:35.697300 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.697210 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:30:35.697300 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.697255 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:30:35.697399 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.697343 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 16:30:35.698137 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.698118 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:30:35.698248 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.698192 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-gvqm7\"" Apr 16 16:30:35.698248 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.698214 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 16:30:35.707217 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.707199 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6hznm"] Apr 16 16:30:35.707328 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.707314 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9kznk"
Apr 16 16:30:35.709514 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.709492 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-95zfb\""
Apr 16 16:30:35.709684 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.709664 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 16:30:35.709684 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.709678 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 16:30:35.709813 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.709747 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 16:30:35.731128 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.731079 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.731128 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.731115 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npcj\" (UniqueName: \"kubernetes.io/projected/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-api-access-9npcj\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.731377 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.731145 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.731377 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.731173 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.731377 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.731252 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed9c32a6-1b36-4e37-9070-e1fc11116efa-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.731377 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.731310 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ed9c32a6-1b36-4e37-9070-e1fc11116efa-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.832219 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832189 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-wtmp\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk"
Apr 16 16:30:35.832403 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9npcj\" (UniqueName: \"kubernetes.io/projected/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-api-access-9npcj\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.832403 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832249 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ed9c32a6-1b36-4e37-9070-e1fc11116efa-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.832403 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832366 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.832564 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832417 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-root\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk"
Apr 16 16:30:35.832564 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-tls\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk"
Apr 16 16:30:35.832564 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:35.832508 2569 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 16 16:30:35.832564 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-metrics-client-ca\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk"
Apr 16 16:30:35.832739 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed9c32a6-1b36-4e37-9070-e1fc11116efa-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.832739 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:35.832591 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-tls podName:ed9c32a6-1b36-4e37-9070-e1fc11116efa nodeName:}" failed. No retries permitted until 2026-04-16 16:30:36.332558488 +0000 UTC m=+56.277068463 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-6hznm" (UID: "ed9c32a6-1b36-4e37-9070-e1fc11116efa") : secret "kube-state-metrics-tls" not found
Apr 16 16:30:35.832739 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832622 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ed9c32a6-1b36-4e37-9070-e1fc11116efa-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
Apr 16 16:30:35.832739 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832678 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-accelerators-collector-config\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk"
Apr 16 16:30:35.832739 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832719 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-textfile\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk"
Apr 16 16:30:35.832944 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn94v\" (UniqueName: \"kubernetes.io/projected/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-kube-api-access-rn94v\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk"
Apr 16 16:30:35.832944 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-sys\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk"
Apr 16 16:30:35.832944 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832850 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm"
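The mount flow above interleaves two prefixes: the journald header (wall-clock time, host, unit) and the kubelet's own klog header, e.g. "E0416 16:30:35.832508 2569 secret.go:189]" carrying severity, MMDD date, time, PID, and source file:line. A minimal Go sketch for splitting that klog header out of a line; the regexp is an assumption written against the lines above, not taken from the kubelet source:

package main

import (
	"fmt"
	"regexp"
)

// klogHeader matches the klog prefix as it appears in this journal:
// severity letter, MMDD, wall-clock time (6 fractional digits), PID,
// source file:line, then the free-form message.
var klogHeader = regexp.MustCompile(
	`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d{6})\s+(\d+)\s+(\S+:\d+)\] (.*)$`)

func main() {
	line := `E0416 16:30:35.832508 2569 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found`
	m := klogHeader.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog line")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s src=%s\nmsg=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}

Feeding it the journal requires first stripping the "Apr 16 ... kubenswrapper[2569]: " journald prefix, which is fixed-shape and easy to cut on the first "]: ".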
\"kubernetes.io/secret/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" Apr 16 16:30:35.832944 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832892 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" Apr 16 16:30:35.833142 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.832970 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.833200 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.833156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed9c32a6-1b36-4e37-9070-e1fc11116efa-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" Apr 16 16:30:35.833468 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.833423 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" Apr 16 16:30:35.847025 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.847000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" Apr 16 16:30:35.847183 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.847165 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npcj\" (UniqueName: \"kubernetes.io/projected/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-api-access-9npcj\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" Apr 16 16:30:35.934080 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934002 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-wtmp\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934080 ip-10-0-132-191 kubenswrapper[2569]: I0416 
16:30:35.934064 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-root\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934280 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934088 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-tls\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934280 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934113 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-metrics-client-ca\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934280 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934141 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-root\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934280 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934149 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-accelerators-collector-config\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934280 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934212 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-wtmp\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934280 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:35.934251 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:30:35.934572 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934249 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-textfile\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934572 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:35.934314 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-tls podName:8ac68bd4-1081-4135-b7ed-90d2c1e552d7 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:36.434291643 +0000 UTC m=+56.378801628 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-tls") pod "node-exporter-9kznk" (UID: "8ac68bd4-1081-4135-b7ed-90d2c1e552d7") : secret "node-exporter-tls" not found Apr 16 16:30:35.934572 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934383 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn94v\" (UniqueName: \"kubernetes.io/projected/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-kube-api-access-rn94v\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934572 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934408 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-sys\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934572 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934572 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934488 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-sys\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934788 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934610 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-textfile\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934788 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934753 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-metrics-client-ca\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.934952 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.934930 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-accelerators-collector-config\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.936792 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.936772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " 
pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:35.944680 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:35.944658 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn94v\" (UniqueName: \"kubernetes.io/projected/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-kube-api-access-rn94v\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:36.338776 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.338696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" Apr 16 16:30:36.341239 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.341207 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed9c32a6-1b36-4e37-9070-e1fc11116efa-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6hznm\" (UID: \"ed9c32a6-1b36-4e37-9070-e1fc11116efa\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" Apr 16 16:30:36.439656 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.439615 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-tls\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:36.441711 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.441685 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8ac68bd4-1081-4135-b7ed-90d2c1e552d7-node-exporter-tls\") pod \"node-exporter-9kznk\" (UID: \"8ac68bd4-1081-4135-b7ed-90d2c1e552d7\") " pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:36.606078 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.606002 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" Apr 16 16:30:36.616115 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.616079 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9kznk" Apr 16 16:30:36.632375 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:36.632332 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac68bd4_1081_4135_b7ed_90d2c1e552d7.slice/crio-707ca6aac650e997773670caefec8162b5296cb3217adb92ed7b730ff0b7c934 WatchSource:0}: Error finding container 707ca6aac650e997773670caefec8162b5296cb3217adb92ed7b730ff0b7c934: Status 404 returned error can't find the container with id 707ca6aac650e997773670caefec8162b5296cb3217adb92ed7b730ff0b7c934 Apr 16 16:30:36.742075 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.742042 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:30:36.753279 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:36.753249 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded9c32a6_1b36_4e37_9070_e1fc11116efa.slice/crio-29b9cd2e36aa6a6a148c8316c37e3d45c75fe550a4e27aecaa1d06043025dd8a WatchSource:0}: Error finding container 29b9cd2e36aa6a6a148c8316c37e3d45c75fe550a4e27aecaa1d06043025dd8a: Status 404 returned error can't find the container with id 29b9cd2e36aa6a6a148c8316c37e3d45c75fe550a4e27aecaa1d06043025dd8a Apr 16 16:30:36.776342 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.776315 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6hznm"] Apr 16 16:30:36.776342 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.776353 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:30:36.776607 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.776559 2569 util.go:30] "No sandbox for pod can be found. 
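The "No sandbox for pod can be found" lines mark first-time sandbox creation for freshly scheduled pods, and the manager.go:1169 warnings beside them look like a benign race: the cgroup watcher sees the new crio-... cgroup before the container is registered, so the lookup briefly 404s, and each affected ID later appears in a ContainerStarted event. A rough triage sketch in Go that tallies these transient patterns in an exported journal so persistent errors stand out; the kubelet.log path and the pattern list are assumptions for illustration:

package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	f, err := os.Open("kubelet.log") // assumed path to an exported journal
	if err != nil {
		panic(err)
	}
	defer f.Close()

	patterns := []string{
		"No sandbox for pod can be found", // first-time sandbox creation
		"Failed to process watch event",   // cgroup-watch registration race
		"MountVolume.SetUp failed",        // volume retries, often transient
	}
	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		for _, p := range patterns {
			if strings.Contains(sc.Text(), p) {
				counts[p]++
			}
		}
	}
	for _, p := range patterns {
		fmt.Printf("%6d  %s\n", counts[p], p)
	}
}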
Apr 16 16:30:36.782055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.778696 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 16:30:36.782055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.779566 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dk6gr\""
Apr 16 16:30:36.782055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.779840 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 16:30:36.782055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.779957 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 16:30:36.782055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.780011 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 16:30:36.782055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.780184 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 16:30:36.782055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.780306 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 16:30:36.782055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.780333 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 16:30:36.782055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.780379 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 16:30:36.782055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.780460 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844191 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844252 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844284 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-web-config\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844315 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844341 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844395 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-out\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844416 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844475 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-volume\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844533 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.846449 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.844633 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px7m5\" (UniqueName: \"kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-kube-api-access-px7m5\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.863601 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.862762 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bw5bz"
Apr 16 16:30:36.895564 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.895503 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" event={"ID":"ed9c32a6-1b36-4e37-9070-e1fc11116efa","Type":"ContainerStarted","Data":"29b9cd2e36aa6a6a148c8316c37e3d45c75fe550a4e27aecaa1d06043025dd8a"}
Apr 16 16:30:36.897129 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.897102 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kznk" event={"ID":"8ac68bd4-1081-4135-b7ed-90d2c1e552d7","Type":"ContainerStarted","Data":"707ca6aac650e997773670caefec8162b5296cb3217adb92ed7b730ff0b7c934"}
Apr 16 16:30:36.946795 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.945881 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-px7m5\" (UniqueName: \"kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-kube-api-access-px7m5\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.947411 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.947373 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.952901 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.952874 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.952960 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.952949 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-web-config\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.953001 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.952987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.953058 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.953011 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.953058 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.953041 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.953153 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.953069 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-out\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.953153 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.953092 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.953258 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.953154 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-volume\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.953258 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.953184 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.953258 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.953220 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:36.953480 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.953256 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:36.953480 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.953282 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:36.953820 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:36.953796 2569 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 16:30:36.953908 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:36.953871 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls podName:dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:37.453849519 +0000 UTC m=+57.398359505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9") : secret "alertmanager-main-tls" not found Apr 16 16:30:36.953908 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.953880 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:36.954285 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:36.954266 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-trusted-ca-bundle podName:dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:37.454245812 +0000 UTC m=+57.398755802 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9") : configmap references non-existent config key: ca-bundle.crt Apr 16 16:30:36.956225 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.956177 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-volume\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:36.956225 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.956194 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:36.956374 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.956196 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:36.956870 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.956835 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-web-config\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:36.956966 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.956941 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:36.957086 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.957068 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-px7m5\" (UniqueName: \"kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-kube-api-access-px7m5\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:36.957086 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.957077 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-out\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:36.958351 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.958303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
16:30:36.958469 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:36.958450 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:37.458180 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:37.458147 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:37.458180 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:37.458195 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:30:37.458464 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:37.458392 2569 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 16:30:37.458535 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:30:37.458473 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls podName:dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:38.458454971 +0000 UTC m=+58.402964947 (durationBeforeRetry 1s). 
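This is the second failed SetUp for secret-alertmanager-main-tls: the first retry was scheduled 500ms out, this one 1s out, so the delay doubles per consecutive failure until the secret appears. A small Go sketch of that doubling schedule; the 500ms start is read directly off the log, while the cap is an assumption modeled on the kubelet's exponential backoff rather than quoted from it:

package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initialDelay = 500 * time.Millisecond // matches durationBeforeRetry 500ms above
		maxDelay     = 2*time.Minute + 2*time.Second // assumed cap for the sketch
	)
	delay := initialDelay
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %2d: wait %v\n", attempt, delay)
		// Double the delay after each failure, clamping at the cap.
		if delay = delay * 2; delay > maxDelay {
			delay = maxDelay
		}
	}
}

The first two printed delays (500ms, 1s) line up with the two retry entries above; in this log the secret shows up before a third retry is needed.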
Apr 16 16:30:37.459066 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:37.459042 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:38.467246 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.467212 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:38.469441 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.469409 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:38.589326 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.589304 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:30:38.751678 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.751651 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:30:38.807406 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:38.807379 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd93b02e_b83e_4ad0_a3ed_db6472dfa3b9.slice/crio-bae43eb70fd32b8dc40f49e18c5002b5f3d692e76920b813d8b48d1a87ba6b02 WatchSource:0}: Error finding container bae43eb70fd32b8dc40f49e18c5002b5f3d692e76920b813d8b48d1a87ba6b02: Status 404 returned error can't find the container with id bae43eb70fd32b8dc40f49e18c5002b5f3d692e76920b813d8b48d1a87ba6b02
Apr 16 16:30:38.904783 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.904745 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" event={"ID":"ed9c32a6-1b36-4e37-9070-e1fc11116efa","Type":"ContainerStarted","Data":"d2f79d654d772b45895d2bb2ac82cec73b481ce0a33c15cd1e0724a5eafe35ae"}
Apr 16 16:30:38.904783 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.904789 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" event={"ID":"ed9c32a6-1b36-4e37-9070-e1fc11116efa","Type":"ContainerStarted","Data":"1f749557dbb40da994c04b95f0f0ee276ee1e2e3a120d4a22fb9ded01303cc95"}
Apr 16 16:30:38.905035 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.904805 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" event={"ID":"ed9c32a6-1b36-4e37-9070-e1fc11116efa","Type":"ContainerStarted","Data":"960013e494d3bd283001dae62442086dab9337ca2c2081def420adcb91e59c3f"}
event={"ID":"ed9c32a6-1b36-4e37-9070-e1fc11116efa","Type":"ContainerStarted","Data":"960013e494d3bd283001dae62442086dab9337ca2c2081def420adcb91e59c3f"} Apr 16 16:30:38.905906 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.905874 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerStarted","Data":"bae43eb70fd32b8dc40f49e18c5002b5f3d692e76920b813d8b48d1a87ba6b02"} Apr 16 16:30:38.907173 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.907149 2569 generic.go:358] "Generic (PLEG): container finished" podID="8ac68bd4-1081-4135-b7ed-90d2c1e552d7" containerID="6da94a68d606d849ddae29228ac03ab5b832b8ff9bd63dc986804e22ac0522e2" exitCode=0 Apr 16 16:30:38.907266 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.907194 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kznk" event={"ID":"8ac68bd4-1081-4135-b7ed-90d2c1e552d7","Type":"ContainerDied","Data":"6da94a68d606d849ddae29228ac03ab5b832b8ff9bd63dc986804e22ac0522e2"} Apr 16 16:30:38.919957 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:38.919914 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-6hznm" podStartSLOduration=2.223912574 podStartE2EDuration="3.919901256s" podCreationTimestamp="2026-04-16 16:30:35 +0000 UTC" firstStartedPulling="2026-04-16 16:30:36.755669989 +0000 UTC m=+56.700179976" lastFinishedPulling="2026-04-16 16:30:38.451658683 +0000 UTC m=+58.396168658" observedRunningTime="2026-04-16 16:30:38.919224764 +0000 UTC m=+58.863734757" watchObservedRunningTime="2026-04-16 16:30:38.919901256 +0000 UTC m=+58.864411248" Apr 16 16:30:39.816943 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:39.816863 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b42xz" Apr 16 16:30:39.912153 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:39.912117 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kznk" event={"ID":"8ac68bd4-1081-4135-b7ed-90d2c1e552d7","Type":"ContainerStarted","Data":"d939ae32374f8c07561315f84d39dc5fc423ed7e590b28d90dae1ebf49119627"} Apr 16 16:30:39.912153 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:39.912158 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kznk" event={"ID":"8ac68bd4-1081-4135-b7ed-90d2c1e552d7","Type":"ContainerStarted","Data":"49ac7ba803441ac7b4ad7aabf7d974de91a0e83477331a1f543c68f19dfc74f9"} Apr 16 16:30:39.930082 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:39.930031 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9kznk" podStartSLOduration=3.11585405 podStartE2EDuration="4.930011663s" podCreationTimestamp="2026-04-16 16:30:35 +0000 UTC" firstStartedPulling="2026-04-16 16:30:36.635036509 +0000 UTC m=+56.579546481" lastFinishedPulling="2026-04-16 16:30:38.449194122 +0000 UTC m=+58.393704094" observedRunningTime="2026-04-16 16:30:39.928635152 +0000 UTC m=+59.873145145" watchObservedRunningTime="2026-04-16 16:30:39.930011663 +0000 UTC m=+59.874521653" Apr 16 16:30:39.989634 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:39.989605 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-84986867f4-jqlsh"] Apr 16 16:30:40.003592 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.003559 2569 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-84986867f4-jqlsh"] Apr 16 16:30:40.003729 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.003655 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.005888 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.005861 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 16:30:40.005888 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.005861 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 16:30:40.006077 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.005958 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-qj9g5\"" Apr 16 16:30:40.006077 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.005979 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 16:30:40.006077 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.006047 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 16:30:40.006290 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.006273 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4ss1r2ttrimg1\"" Apr 16 16:30:40.083894 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.083814 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5210c885-bbff-4603-a4c8-b48a036b3f53-secret-metrics-server-tls\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.084040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.083920 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5210c885-bbff-4603-a4c8-b48a036b3f53-secret-metrics-server-client-certs\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.084040 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.083994 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5210c885-bbff-4603-a4c8-b48a036b3f53-audit-log\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.084142 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.084057 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5210c885-bbff-4603-a4c8-b48a036b3f53-client-ca-bundle\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.084142 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.084100 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5210c885-bbff-4603-a4c8-b48a036b3f53-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.084142 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.084124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfzqr\" (UniqueName: \"kubernetes.io/projected/5210c885-bbff-4603-a4c8-b48a036b3f53-kube-api-access-pfzqr\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.084284 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.084190 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5210c885-bbff-4603-a4c8-b48a036b3f53-metrics-server-audit-profiles\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.185093 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.185052 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5210c885-bbff-4603-a4c8-b48a036b3f53-audit-log\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.185250 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.185118 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5210c885-bbff-4603-a4c8-b48a036b3f53-client-ca-bundle\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.185250 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.185147 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5210c885-bbff-4603-a4c8-b48a036b3f53-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.185250 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.185175 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfzqr\" (UniqueName: \"kubernetes.io/projected/5210c885-bbff-4603-a4c8-b48a036b3f53-kube-api-access-pfzqr\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.185250 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.185210 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5210c885-bbff-4603-a4c8-b48a036b3f53-metrics-server-audit-profiles\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 
16:30:40.185498 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.185266 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5210c885-bbff-4603-a4c8-b48a036b3f53-secret-metrics-server-tls\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.185498 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.185304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5210c885-bbff-4603-a4c8-b48a036b3f53-secret-metrics-server-client-certs\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.185498 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.185484 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5210c885-bbff-4603-a4c8-b48a036b3f53-audit-log\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.186259 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.186188 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5210c885-bbff-4603-a4c8-b48a036b3f53-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.186490 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.186454 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5210c885-bbff-4603-a4c8-b48a036b3f53-metrics-server-audit-profiles\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.188220 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.188197 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5210c885-bbff-4603-a4c8-b48a036b3f53-secret-metrics-server-client-certs\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.188874 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.188849 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5210c885-bbff-4603-a4c8-b48a036b3f53-secret-metrics-server-tls\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.188980 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.188879 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5210c885-bbff-4603-a4c8-b48a036b3f53-client-ca-bundle\") pod \"metrics-server-84986867f4-jqlsh\" (UID: \"5210c885-bbff-4603-a4c8-b48a036b3f53\") " pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:30:40.195727 
Apr 16 16:30:40.312975 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.312936 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-84986867f4-jqlsh"
Apr 16 16:30:40.449399 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.449372 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz"]
Apr 16 16:30:40.465180 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.465151 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz"]
Apr 16 16:30:40.465314 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.465287 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz"
Apr 16 16:30:40.467470 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.467363 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 16:30:40.467470 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.467372 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-48lj9\""
Apr 16 16:30:40.516381 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.514788 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-84986867f4-jqlsh"]
Apr 16 16:30:40.517151 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:40.517123 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5210c885_bbff_4603_a4c8_b48a036b3f53.slice/crio-ad5627ea2ef5831234d64b06f6a874f44c836a802e44981573b6b79220ae9ec6 WatchSource:0}: Error finding container ad5627ea2ef5831234d64b06f6a874f44c836a802e44981573b6b79220ae9ec6: Status 404 returned error can't find the container with id ad5627ea2ef5831234d64b06f6a874f44c836a802e44981573b6b79220ae9ec6
Apr 16 16:30:40.589180 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.589150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b57d10b7-23ab-4f4d-9d24-c0e8d2ec879a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-hlljz\" (UID: \"b57d10b7-23ab-4f4d-9d24-c0e8d2ec879a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz"
Apr 16 16:30:40.689595 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.689552 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b57d10b7-23ab-4f4d-9d24-c0e8d2ec879a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-hlljz\" (UID: \"b57d10b7-23ab-4f4d-9d24-c0e8d2ec879a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz"
Apr 16 16:30:40.691667 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.691651 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 16:30:40.712565 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.712543 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b57d10b7-23ab-4f4d-9d24-c0e8d2ec879a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-hlljz\" (UID: \"b57d10b7-23ab-4f4d-9d24-c0e8d2ec879a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz"
Apr 16 16:30:40.777535 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.777514 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-48lj9\""
Apr 16 16:30:40.786257 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.786232 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz"
Apr 16 16:30:40.902474 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.902446 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz"]
Apr 16 16:30:40.905325 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:40.905299 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb57d10b7_23ab_4f4d_9d24_c0e8d2ec879a.slice/crio-54163bd5758599a68b048c55bbc5e29f3d0244d30fd922830fcc6ed82f730d1a WatchSource:0}: Error finding container 54163bd5758599a68b048c55bbc5e29f3d0244d30fd922830fcc6ed82f730d1a: Status 404 returned error can't find the container with id 54163bd5758599a68b048c55bbc5e29f3d0244d30fd922830fcc6ed82f730d1a
Apr 16 16:30:40.915738 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.915713 2569 generic.go:358] "Generic (PLEG): container finished" podID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerID="366407ec89a09b6c481f6b908172f6574ed48850f84ee83dc983f2cbf0d80e17" exitCode=0
Apr 16 16:30:40.915823 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.915786 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerDied","Data":"366407ec89a09b6c481f6b908172f6574ed48850f84ee83dc983f2cbf0d80e17"}
Apr 16 16:30:40.916911 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.916879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz" event={"ID":"b57d10b7-23ab-4f4d-9d24-c0e8d2ec879a","Type":"ContainerStarted","Data":"54163bd5758599a68b048c55bbc5e29f3d0244d30fd922830fcc6ed82f730d1a"}
Apr 16 16:30:40.917881 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:40.917858 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" event={"ID":"5210c885-bbff-4603-a4c8-b48a036b3f53","Type":"ContainerStarted","Data":"ad5627ea2ef5831234d64b06f6a874f44c836a802e44981573b6b79220ae9ec6"}
Apr 16 16:30:42.929563 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:42.929533 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerStarted","Data":"9d8d6ad8aab3b5edbc8062be9e129097f6e4acf511bd197cedf5936bda20a435"}
Apr 16 16:30:42.929904 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:42.929574 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerStarted","Data":"928e16fbab40a7ba65fca96e5f8acd83e9fbf5a0e2578888b74cea128f761114"}
event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerStarted","Data":"928e16fbab40a7ba65fca96e5f8acd83e9fbf5a0e2578888b74cea128f761114"} Apr 16 16:30:42.929904 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:42.929589 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerStarted","Data":"3b81c969cc7b854ff5389a14c2a252aed118b6cab28ea99d1fe2a1ba26ccd88b"} Apr 16 16:30:42.929904 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:42.929601 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerStarted","Data":"dd0ebdcb876ca7ed268a142ee73b52ec131e771d94042ef30182a0de7b9f4842"} Apr 16 16:30:42.931251 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:42.931225 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz" event={"ID":"b57d10b7-23ab-4f4d-9d24-c0e8d2ec879a","Type":"ContainerStarted","Data":"f3d0466a7546a4100b8a9a03531ab75c55068a7aaaae9b7827f487b0a27d69d4"} Apr 16 16:30:42.932178 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:42.932161 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz" Apr 16 16:30:42.934991 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:42.934390 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" event={"ID":"5210c885-bbff-4603-a4c8-b48a036b3f53","Type":"ContainerStarted","Data":"ab087be4d5600e154ddaae3aba6ce3cd3560a0452f2a7fa8a5246f3d52136946"} Apr 16 16:30:42.938631 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:42.938614 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz" Apr 16 16:30:42.959386 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:42.959341 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-hlljz" podStartSLOduration=1.1951615389999999 podStartE2EDuration="2.959323187s" podCreationTimestamp="2026-04-16 16:30:40 +0000 UTC" firstStartedPulling="2026-04-16 16:30:40.907185911 +0000 UTC m=+60.851695884" lastFinishedPulling="2026-04-16 16:30:42.671347551 +0000 UTC m=+62.615857532" observedRunningTime="2026-04-16 16:30:42.944538521 +0000 UTC m=+62.889048514" watchObservedRunningTime="2026-04-16 16:30:42.959323187 +0000 UTC m=+62.903833181" Apr 16 16:30:42.973315 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:42.973218 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" podStartSLOduration=1.8211007989999999 podStartE2EDuration="3.973205454s" podCreationTimestamp="2026-04-16 16:30:39 +0000 UTC" firstStartedPulling="2026-04-16 16:30:40.519244139 +0000 UTC m=+60.463754110" lastFinishedPulling="2026-04-16 16:30:42.671348786 +0000 UTC m=+62.615858765" observedRunningTime="2026-04-16 16:30:42.972498356 +0000 UTC m=+62.917008350" watchObservedRunningTime="2026-04-16 16:30:42.973205454 +0000 UTC m=+62.917715446" Apr 16 16:30:43.940454 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:43.940397 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerStarted","Data":"c8b97660886c54585ffd34f2a5fba149487dcbeec954b64f38334dddb880de76"} Apr 16 16:30:43.940919 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:43.940465 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerStarted","Data":"6e0116d873c95384ec216c39f157d4fca2b736adb7d3180dcf945a2dfad3bb76"} Apr 16 16:30:43.966099 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:43.966053 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.002961807 podStartE2EDuration="7.966036818s" podCreationTimestamp="2026-04-16 16:30:36 +0000 UTC" firstStartedPulling="2026-04-16 16:30:38.809528886 +0000 UTC m=+58.754038856" lastFinishedPulling="2026-04-16 16:30:43.772603888 +0000 UTC m=+63.717113867" observedRunningTime="2026-04-16 16:30:43.965824919 +0000 UTC m=+63.910334911" watchObservedRunningTime="2026-04-16 16:30:43.966036818 +0000 UTC m=+63.910546811" Apr 16 16:30:44.853298 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:44.853273 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-767dc6c99d-gsj9l" Apr 16 16:30:46.339590 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.339547 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg8r\" (UniqueName: \"kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r\") pod \"network-check-target-9jq9v\" (UID: \"f81e14b6-a4d4-417f-9556-bdceafdafe3a\") " pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:46.340093 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.339611 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:46.341966 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.341944 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:30:46.342081 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.342005 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:30:46.352545 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.352515 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:30:46.352830 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.352812 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbad60e-9cf1-43dd-abb0-8d7c1caab371-metrics-certs\") pod \"network-metrics-daemon-ff4ns\" (UID: \"3fbad60e-9cf1-43dd-abb0-8d7c1caab371\") " pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:46.363762 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.363739 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xg8r\" (UniqueName: \"kubernetes.io/projected/f81e14b6-a4d4-417f-9556-bdceafdafe3a-kube-api-access-2xg8r\") pod 
\"network-check-target-9jq9v\" (UID: \"f81e14b6-a4d4-417f-9556-bdceafdafe3a\") " pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:46.612733 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.612653 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqp8k\"" Apr 16 16:30:46.620441 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.620412 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4ns" Apr 16 16:30:46.622220 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.622194 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ws7rw\"" Apr 16 16:30:46.630865 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.630839 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:46.764272 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.764242 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ff4ns"] Apr 16 16:30:46.768474 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:46.768423 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fbad60e_9cf1_43dd_abb0_8d7c1caab371.slice/crio-d35ba51e4e019321b65db78dc217dbd4b45b2e6a47c976f2fc3b7628e1f462ba WatchSource:0}: Error finding container d35ba51e4e019321b65db78dc217dbd4b45b2e6a47c976f2fc3b7628e1f462ba: Status 404 returned error can't find the container with id d35ba51e4e019321b65db78dc217dbd4b45b2e6a47c976f2fc3b7628e1f462ba Apr 16 16:30:46.781610 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.781586 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9jq9v"] Apr 16 16:30:46.784336 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:30:46.784306 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81e14b6_a4d4_417f_9556_bdceafdafe3a.slice/crio-7aa555de3c6154b94d6d2192b54d4d7813b208829509cc819d626c507cd80c1f WatchSource:0}: Error finding container 7aa555de3c6154b94d6d2192b54d4d7813b208829509cc819d626c507cd80c1f: Status 404 returned error can't find the container with id 7aa555de3c6154b94d6d2192b54d4d7813b208829509cc819d626c507cd80c1f Apr 16 16:30:46.950536 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.950501 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9jq9v" event={"ID":"f81e14b6-a4d4-417f-9556-bdceafdafe3a","Type":"ContainerStarted","Data":"7aa555de3c6154b94d6d2192b54d4d7813b208829509cc819d626c507cd80c1f"} Apr 16 16:30:46.951540 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:46.951514 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff4ns" event={"ID":"3fbad60e-9cf1-43dd-abb0-8d7c1caab371","Type":"ContainerStarted","Data":"d35ba51e4e019321b65db78dc217dbd4b45b2e6a47c976f2fc3b7628e1f462ba"} Apr 16 16:30:48.962907 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:48.962866 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff4ns" 
event={"ID":"3fbad60e-9cf1-43dd-abb0-8d7c1caab371","Type":"ContainerStarted","Data":"ae272daafed2bd0154c8c27c8acd5ef3f32b35e88f70aa91390d2f776ac3da6b"} Apr 16 16:30:48.962907 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:48.962913 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff4ns" event={"ID":"3fbad60e-9cf1-43dd-abb0-8d7c1caab371","Type":"ContainerStarted","Data":"dcba1cbf3c32658a7760c4888385bde6e7c87308047abfc78d1472778ca53967"} Apr 16 16:30:48.977803 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:48.977760 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ff4ns" podStartSLOduration=67.823850869 podStartE2EDuration="1m8.977745554s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:30:46.771088018 +0000 UTC m=+66.715597994" lastFinishedPulling="2026-04-16 16:30:47.924982704 +0000 UTC m=+67.869492679" observedRunningTime="2026-04-16 16:30:48.976594337 +0000 UTC m=+68.921104364" watchObservedRunningTime="2026-04-16 16:30:48.977745554 +0000 UTC m=+68.922255547" Apr 16 16:30:49.967374 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:49.967342 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9jq9v" event={"ID":"f81e14b6-a4d4-417f-9556-bdceafdafe3a","Type":"ContainerStarted","Data":"c4ece30cf5cf26ce5949eaf1201616bb3a02cf749b4597608f65e37cf8e44d54"} Apr 16 16:30:49.967763 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:49.967440 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:30:49.982039 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:30:49.981997 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9jq9v" podStartSLOduration=67.276333335 podStartE2EDuration="1m9.981984286s" podCreationTimestamp="2026-04-16 16:29:40 +0000 UTC" firstStartedPulling="2026-04-16 16:30:46.786195589 +0000 UTC m=+66.730705561" lastFinishedPulling="2026-04-16 16:30:49.491846533 +0000 UTC m=+69.436356512" observedRunningTime="2026-04-16 16:30:49.981464003 +0000 UTC m=+69.925973995" watchObservedRunningTime="2026-04-16 16:30:49.981984286 +0000 UTC m=+69.926494278" Apr 16 16:31:00.313949 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:00.313911 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:31:00.313949 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:00.313957 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:31:20.319079 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:20.319044 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:31:20.323201 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:20.323182 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-84986867f4-jqlsh" Apr 16 16:31:20.972288 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:20.972259 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9jq9v" Apr 16 16:31:45.917502 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:45.917470 2569 
Apr 16 16:31:45.918646 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:45.918609 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="alertmanager" containerID="cri-o://dd0ebdcb876ca7ed268a142ee73b52ec131e771d94042ef30182a0de7b9f4842" gracePeriod=120
Apr 16 16:31:45.919006 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:45.918956 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="config-reloader" containerID="cri-o://3b81c969cc7b854ff5389a14c2a252aed118b6cab28ea99d1fe2a1ba26ccd88b" gracePeriod=120
Apr 16 16:31:45.919120 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:45.919043 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy-web" containerID="cri-o://928e16fbab40a7ba65fca96e5f8acd83e9fbf5a0e2578888b74cea128f761114" gracePeriod=120
Apr 16 16:31:45.919120 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:45.919033 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="prom-label-proxy" containerID="cri-o://c8b97660886c54585ffd34f2a5fba149487dcbeec954b64f38334dddb880de76" gracePeriod=120
Apr 16 16:31:45.919120 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:45.919082 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy-metric" containerID="cri-o://6e0116d873c95384ec216c39f157d4fca2b736adb7d3180dcf945a2dfad3bb76" gracePeriod=120
Apr 16 16:31:45.919326 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:45.918976 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy" containerID="cri-o://9d8d6ad8aab3b5edbc8062be9e129097f6e4acf511bd197cedf5936bda20a435" gracePeriod=120
Apr 16 16:31:46.121621 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:46.121590 2569 generic.go:358] "Generic (PLEG): container finished" podID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerID="c8b97660886c54585ffd34f2a5fba149487dcbeec954b64f38334dddb880de76" exitCode=0
Apr 16 16:31:46.121621 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:46.121613 2569 generic.go:358] "Generic (PLEG): container finished" podID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerID="9d8d6ad8aab3b5edbc8062be9e129097f6e4acf511bd197cedf5936bda20a435" exitCode=0
Apr 16 16:31:46.121621 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:46.121619 2569 generic.go:358] "Generic (PLEG): container finished" podID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerID="3b81c969cc7b854ff5389a14c2a252aed118b6cab28ea99d1fe2a1ba26ccd88b" exitCode=0
Apr 16 16:31:46.121621 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:46.121625 2569 generic.go:358] "Generic (PLEG): container finished" podID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerID="dd0ebdcb876ca7ed268a142ee73b52ec131e771d94042ef30182a0de7b9f4842" exitCode=0
Apr 16 16:31:46.121857 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:46.121663 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerDied","Data":"c8b97660886c54585ffd34f2a5fba149487dcbeec954b64f38334dddb880de76"}
Apr 16 16:31:46.121857 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:46.121696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerDied","Data":"9d8d6ad8aab3b5edbc8062be9e129097f6e4acf511bd197cedf5936bda20a435"}
Apr 16 16:31:46.121857 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:46.121708 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerDied","Data":"3b81c969cc7b854ff5389a14c2a252aed118b6cab28ea99d1fe2a1ba26ccd88b"}
Apr 16 16:31:46.121857 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:46.121721 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerDied","Data":"dd0ebdcb876ca7ed268a142ee73b52ec131e771d94042ef30182a0de7b9f4842"}
Apr 16 16:31:47.127624 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.127597 2569 generic.go:358] "Generic (PLEG): container finished" podID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerID="6e0116d873c95384ec216c39f157d4fca2b736adb7d3180dcf945a2dfad3bb76" exitCode=0
Apr 16 16:31:47.127624 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.127620 2569 generic.go:358] "Generic (PLEG): container finished" podID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerID="928e16fbab40a7ba65fca96e5f8acd83e9fbf5a0e2578888b74cea128f761114" exitCode=0
Apr 16 16:31:47.128013 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.127672 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerDied","Data":"6e0116d873c95384ec216c39f157d4fca2b736adb7d3180dcf945a2dfad3bb76"}
Apr 16 16:31:47.128013 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.127713 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerDied","Data":"928e16fbab40a7ba65fca96e5f8acd83e9fbf5a0e2578888b74cea128f761114"}
Apr 16 16:31:47.149328 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.149309 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:47.184397 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184369 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-web-config\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184397 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184403 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184626 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184466 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-out\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184626 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184496 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-web\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184740 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184617 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184740 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184674 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-trusted-ca-bundle\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184740 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184701 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-volume\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184740 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184731 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-cluster-tls-config\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184944 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184766 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-main-db\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184944 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184794 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184944 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184833 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px7m5\" (UniqueName: \"kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-kube-api-access-px7m5\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184944 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184857 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-metrics-client-ca\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.184944 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.184887 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-tls-assets\") pod \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\" (UID: \"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9\") "
Apr 16 16:31:47.186244 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.186208 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:31:47.186690 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.186665 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:31:47.186779 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.186765 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:31:47.187542 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.187492 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:47.188590 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.188558 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:47.189010 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.188974 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-out" (OuterVolumeSpecName: "config-out") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:47.189221 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.189182 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:47.190191 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.190064 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-volume" (OuterVolumeSpecName: "config-volume") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:47.190191 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.190134 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:31:47.190191 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.190156 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:47.191372 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.191343 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-kube-api-access-px7m5" (OuterVolumeSpecName: "kube-api-access-px7m5") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "kube-api-access-px7m5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:31:47.195662 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.195621 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:47.199793 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.199772 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-web-config" (OuterVolumeSpecName: "web-config") pod "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" (UID: "dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:47.286343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286254 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-out\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286297 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286308 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286320 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286330 2569 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-config-volume\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286339 2569 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-cluster-tls-config\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286347 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-alertmanager-main-db\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286356 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286713 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286366 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-px7m5\" (UniqueName: \"kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-kube-api-access-px7m5\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286713 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286374 2569 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-metrics-client-ca\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286713 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286390 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-tls-assets\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286713 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286398 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-web-config\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:47.286713 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:47.286406 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9-secret-alertmanager-main-tls\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:31:48.132396 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.132359 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9","Type":"ContainerDied","Data":"bae43eb70fd32b8dc40f49e18c5002b5f3d692e76920b813d8b48d1a87ba6b02"} Apr 16 16:31:48.132827 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.132418 2569 scope.go:117] "RemoveContainer" containerID="c8b97660886c54585ffd34f2a5fba149487dcbeec954b64f38334dddb880de76" Apr 16 16:31:48.132827 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.132477 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.140182 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.140166 2569 scope.go:117] "RemoveContainer" containerID="6e0116d873c95384ec216c39f157d4fca2b736adb7d3180dcf945a2dfad3bb76" Apr 16 16:31:48.147247 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.147231 2569 scope.go:117] "RemoveContainer" containerID="9d8d6ad8aab3b5edbc8062be9e129097f6e4acf511bd197cedf5936bda20a435" Apr 16 16:31:48.153128 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.153112 2569 scope.go:117] "RemoveContainer" containerID="928e16fbab40a7ba65fca96e5f8acd83e9fbf5a0e2578888b74cea128f761114" Apr 16 16:31:48.158752 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.158734 2569 scope.go:117] "RemoveContainer" containerID="3b81c969cc7b854ff5389a14c2a252aed118b6cab28ea99d1fe2a1ba26ccd88b" Apr 16 16:31:48.161343 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.161319 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:31:48.166189 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.166171 2569 scope.go:117] "RemoveContainer" containerID="dd0ebdcb876ca7ed268a142ee73b52ec131e771d94042ef30182a0de7b9f4842" Apr 16 16:31:48.166346 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.166327 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:31:48.172242 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.172227 2569 scope.go:117] "RemoveContainer" containerID="366407ec89a09b6c481f6b908172f6574ed48850f84ee83dc983f2cbf0d80e17" Apr 16 16:31:48.192061 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192026 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:31:48.192422 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192404 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy-metric" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192425 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy-metric" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192457 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="init-config-reloader" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192466 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="init-config-reloader" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192475 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy-web" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192483 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy-web" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192491 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="alertmanager" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192497 2569 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="alertmanager" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192510 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="config-reloader" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192515 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="config-reloader" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192521 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192526 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192535 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="prom-label-proxy" Apr 16 16:31:48.192543 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192540 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="prom-label-proxy" Apr 16 16:31:48.192931 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192583 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="config-reloader" Apr 16 16:31:48.192931 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192590 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="prom-label-proxy" Apr 16 16:31:48.192931 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192597 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy-web" Apr 16 16:31:48.192931 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192603 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy" Apr 16 16:31:48.192931 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192608 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="kube-rbac-proxy-metric" Apr 16 16:31:48.192931 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.192616 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" containerName="alertmanager" Apr 16 16:31:48.195656 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.195639 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.198712 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.198692 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 16:31:48.198825 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.198808 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 16:31:48.199031 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.199017 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 16:31:48.199382 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.199359 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 16:31:48.199518 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.199504 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dk6gr\"" Apr 16 16:31:48.199623 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.199603 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 16:31:48.199777 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.199760 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 16:31:48.199901 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.199886 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 16:31:48.199980 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.199968 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 16:31:48.205214 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.205194 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 16:31:48.209347 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.209326 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:31:48.294766 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.294685 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.294766 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.294729 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.294766 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.294761 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-web-config\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.294954 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.294787 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.294954 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.294827 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.294954 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.294868 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.295049 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.294970 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.295049 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.294995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.295049 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.295013 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.295049 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.295029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l79jz\" (UniqueName: \"kubernetes.io/projected/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-kube-api-access-l79jz\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.295228 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.295053 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-config-out\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.295228 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.295124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.295228 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.295169 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-config-volume\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.395840 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.395793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-config-out\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.395840 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.395847 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.396082 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.395873 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-config-volume\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.396082 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.395910 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.396082 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.395933 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.396082 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.395958 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-web-config\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.396277 ip-10-0-132-191 kubenswrapper[2569]: 
I0416 16:31:48.396177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.396277 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.396227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.396385 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.396274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.396385 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.396298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.396385 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.396363 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.396728 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.396391 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.396809 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.396755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.396809 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.396781 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.396914 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.396847 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l79jz\" (UniqueName: 
\"kubernetes.io/projected/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-kube-api-access-l79jz\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.398102 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.398036 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.398954 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.398804 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-config-out\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.398954 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.398908 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.399149 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.399114 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.399245 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.399191 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.399551 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.399530 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.399624 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.399555 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-config-volume\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.400152 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.400134 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:31:48.400308 ip-10-0-132-191 kubenswrapper[2569]: I0416 
16:31:48.400293 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.400564 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.400547 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-web-config\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.406399 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.406369 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l79jz\" (UniqueName: \"kubernetes.io/projected/5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1-kube-api-access-l79jz\") pod \"alertmanager-main-0\" (UID: \"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.506042 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.506007 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:31:48.604622 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.604593 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9" path="/var/lib/kubelet/pods/dd93b02e-b83e-4ad0-a3ed-db6472dfa3b9/volumes"
Apr 16 16:31:48.630622 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:48.630600 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:31:48.633102 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:31:48.633069 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e9cdb47_3a95_4f8c_8b36_6dc0ee150dd1.slice/crio-60752257feffddbf9eff25d7e6d789b188244ec9db97b4a496eeb30f74c85fd4 WatchSource:0}: Error finding container 60752257feffddbf9eff25d7e6d789b188244ec9db97b4a496eeb30f74c85fd4: Status 404 returned error can't find the container with id 60752257feffddbf9eff25d7e6d789b188244ec9db97b4a496eeb30f74c85fd4
Apr 16 16:31:49.136827 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.136791 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1" containerID="2a8a0d81107f66e7ddc4206e4b4b6c579bd77ac8df75002276a634923a418984" exitCode=0
Apr 16 16:31:49.137199 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.136878 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1","Type":"ContainerDied","Data":"2a8a0d81107f66e7ddc4206e4b4b6c579bd77ac8df75002276a634923a418984"}
Apr 16 16:31:49.137199 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.136909 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1","Type":"ContainerStarted","Data":"60752257feffddbf9eff25d7e6d789b188244ec9db97b4a496eeb30f74c85fd4"}
Apr 16 16:31:49.944450 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.944407 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"]
Apr 16 16:31:49.947668 
ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.947646 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:49.951819 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.951802 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-dsfgb\""
Apr 16 16:31:49.952001 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.951983 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 16:31:49.952135 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.952122 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 16:31:49.953206 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.953189 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 16:31:49.953268 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.953189 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 16:31:49.954261 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.954233 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 16:31:49.971983 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.971960 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 16:31:49.981550 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:49.981525 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"]
Apr 16 16:31:50.010520 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.010492 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d47t5\" (UniqueName: \"kubernetes.io/projected/26950289-2205-4f21-8af0-9d60b932ac3d-kube-api-access-d47t5\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.010674 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.010527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-secret-telemeter-client\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.010674 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.010580 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-federate-client-tls\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.010674 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.010605 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/26950289-2205-4f21-8af0-9d60b932ac3d-serving-certs-ca-bundle\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" Apr 16 16:31:50.010674 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.010641 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26950289-2205-4f21-8af0-9d60b932ac3d-metrics-client-ca\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" Apr 16 16:31:50.010864 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.010679 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-telemeter-client-tls\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" Apr 16 16:31:50.010864 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.010713 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" Apr 16 16:31:50.010864 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.010730 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26950289-2205-4f21-8af0-9d60b932ac3d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" Apr 16 16:31:50.111295 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.111254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d47t5\" (UniqueName: \"kubernetes.io/projected/26950289-2205-4f21-8af0-9d60b932ac3d-kube-api-access-d47t5\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" Apr 16 16:31:50.111520 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.111315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-secret-telemeter-client\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" Apr 16 16:31:50.111520 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.111341 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-federate-client-tls\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" Apr 16 16:31:50.111520 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.111364 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26950289-2205-4f21-8af0-9d60b932ac3d-serving-certs-ca-bundle\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.111520 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.111403 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26950289-2205-4f21-8af0-9d60b932ac3d-metrics-client-ca\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.111718 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.111547 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-telemeter-client-tls\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.111718 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.111606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.111718 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.111636 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26950289-2205-4f21-8af0-9d60b932ac3d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.112186 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.112163 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26950289-2205-4f21-8af0-9d60b932ac3d-serving-certs-ca-bundle\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.112257 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.112234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26950289-2205-4f21-8af0-9d60b932ac3d-metrics-client-ca\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.112385 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.112368 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26950289-2205-4f21-8af0-9d60b932ac3d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.114631 ip-10-0-132-191 kubenswrapper[2569]: 
I0416 16:31:50.114598 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.114745 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.114653 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-telemeter-client-tls\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.114800 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.114753 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-secret-telemeter-client\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.114800 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.114763 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/26950289-2205-4f21-8af0-9d60b932ac3d-federate-client-tls\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.119554 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.119532 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d47t5\" (UniqueName: \"kubernetes.io/projected/26950289-2205-4f21-8af0-9d60b932ac3d-kube-api-access-d47t5\") pod \"telemeter-client-5d8f8c7b85-d62nd\" (UID: \"26950289-2205-4f21-8af0-9d60b932ac3d\") " pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.143257 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.143223 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1","Type":"ContainerStarted","Data":"e1be94c0cdf8ad716975aea5c46d2f1ecc5c7959697a57c24720b398e73971f2"}
Apr 16 16:31:50.143581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.143262 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1","Type":"ContainerStarted","Data":"ce28a18d4f492021a6c2607af1163feac039d60b03f43f8dc74e6becd2d8729f"}
Apr 16 16:31:50.143581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.143278 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1","Type":"ContainerStarted","Data":"49a4b90e2b873b063351b81069e896d7ed05a6c5ba95753d5dea0f553fa81d75"}
Apr 16 16:31:50.143581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.143291 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1","Type":"ContainerStarted","Data":"b69f38e536459652ee8c12bbf95c5701788797dec68f6814c8a0d88171427ce3"}
Apr 16 16:31:50.143581 ip-10-0-132-191 kubenswrapper[2569]: 
I0416 16:31:50.143302 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1","Type":"ContainerStarted","Data":"d5fdb2cd396f145fa0a21d26dda3c106a6db50e7855e056315993d7ada3d65f2"}
Apr 16 16:31:50.143581 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.143314 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1","Type":"ContainerStarted","Data":"a1a13f6b66ba447302b6d1096ea85188d0537ffdf7831545098f295bad166967"}
Apr 16 16:31:50.170417 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.170355 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.170336859 podStartE2EDuration="2.170336859s" podCreationTimestamp="2026-04-16 16:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:31:50.168670621 +0000 UTC m=+130.113180648" watchObservedRunningTime="2026-04-16 16:31:50.170336859 +0000 UTC m=+130.114846853"
Apr 16 16:31:50.257009 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.256932 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"
Apr 16 16:31:50.374321 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:50.374291 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd"]
Apr 16 16:31:50.376867 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:31:50.376837 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26950289_2205_4f21_8af0_9d60b932ac3d.slice/crio-027cdc79201eee8bc3ad547edc49d1b72a723564612d35d822cae2241c88ac5c WatchSource:0}: Error finding container 027cdc79201eee8bc3ad547edc49d1b72a723564612d35d822cae2241c88ac5c: Status 404 returned error can't find the container with id 027cdc79201eee8bc3ad547edc49d1b72a723564612d35d822cae2241c88ac5c
Apr 16 16:31:51.149494 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:51.149452 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" event={"ID":"26950289-2205-4f21-8af0-9d60b932ac3d","Type":"ContainerStarted","Data":"027cdc79201eee8bc3ad547edc49d1b72a723564612d35d822cae2241c88ac5c"}
Apr 16 16:31:52.154098 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:52.154060 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" event={"ID":"26950289-2205-4f21-8af0-9d60b932ac3d","Type":"ContainerStarted","Data":"9798c9f1f1564d8471ea3f6fa0503d9d19345384f8ba1306e3c2162a2db4cf5e"}
Apr 16 16:31:52.154542 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:52.154105 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" event={"ID":"26950289-2205-4f21-8af0-9d60b932ac3d","Type":"ContainerStarted","Data":"eeb94071475a89c13e1e8406ffdcc1c2d38debd54ac8dd97765329b6becfac52"}
Apr 16 16:31:52.154542 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:52.154119 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" 
event={"ID":"26950289-2205-4f21-8af0-9d60b932ac3d","Type":"ContainerStarted","Data":"638b78736b10f6da5b0683abc3f56a0cf436c03b7bd6d0704815e71b8551f843"} Apr 16 16:31:52.173073 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:31:52.173018 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5d8f8c7b85-d62nd" podStartSLOduration=1.7475401750000001 podStartE2EDuration="3.172999273s" podCreationTimestamp="2026-04-16 16:31:49 +0000 UTC" firstStartedPulling="2026-04-16 16:31:50.378612798 +0000 UTC m=+130.323122769" lastFinishedPulling="2026-04-16 16:31:51.804071876 +0000 UTC m=+131.748581867" observedRunningTime="2026-04-16 16:31:52.171586122 +0000 UTC m=+132.116096115" watchObservedRunningTime="2026-04-16 16:31:52.172999273 +0000 UTC m=+132.117509268" Apr 16 16:33:46.504417 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.504379 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc"] Apr 16 16:33:46.507794 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.507770 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.510362 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.510343 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 16:33:46.510488 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.510452 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 16:33:46.511307 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.511291 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-cn7qh\"" Apr 16 16:33:46.521981 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.521960 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc"] Apr 16 16:33:46.603113 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.603086 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltwqm\" (UniqueName: \"kubernetes.io/projected/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-kube-api-access-ltwqm\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.603260 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.603118 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.603260 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.603140 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc\" (UID: 
\"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.703713 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.703680 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltwqm\" (UniqueName: \"kubernetes.io/projected/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-kube-api-access-ltwqm\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.703859 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.703724 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.703859 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.703756 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.704121 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.704102 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.704184 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.704134 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.711526 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.711498 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltwqm\" (UniqueName: \"kubernetes.io/projected/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-kube-api-access-ltwqm\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.816214 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.816148 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:33:46.933901 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:46.933881 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc"] Apr 16 16:33:46.936341 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:33:46.936314 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fc2d2e6_4484_4c41_a4f1_f9594e42525b.slice/crio-2ae66be95aa94c5b019bb49bdab16395aeedd80e433f4245f0c1c17e1ff04a6e WatchSource:0}: Error finding container 2ae66be95aa94c5b019bb49bdab16395aeedd80e433f4245f0c1c17e1ff04a6e: Status 404 returned error can't find the container with id 2ae66be95aa94c5b019bb49bdab16395aeedd80e433f4245f0c1c17e1ff04a6e Apr 16 16:33:47.466176 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:47.466129 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" event={"ID":"6fc2d2e6-4484-4c41-a4f1-f9594e42525b","Type":"ContainerStarted","Data":"2ae66be95aa94c5b019bb49bdab16395aeedd80e433f4245f0c1c17e1ff04a6e"} Apr 16 16:33:52.483842 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:52.483803 2569 generic.go:358] "Generic (PLEG): container finished" podID="6fc2d2e6-4484-4c41-a4f1-f9594e42525b" containerID="4aa446128f869e13018ac90420952e09ca0b5debcb72133c8b3db07616514a7c" exitCode=0 Apr 16 16:33:52.484291 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:52.483858 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" event={"ID":"6fc2d2e6-4484-4c41-a4f1-f9594e42525b","Type":"ContainerDied","Data":"4aa446128f869e13018ac90420952e09ca0b5debcb72133c8b3db07616514a7c"} Apr 16 16:33:54.491084 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:54.491003 2569 generic.go:358] "Generic (PLEG): container finished" podID="6fc2d2e6-4484-4c41-a4f1-f9594e42525b" containerID="e2da56f2e976b9ad4be6a313781ad88e71f6e36e9b73acbdc93c76c5d684c2ff" exitCode=0 Apr 16 16:33:54.491084 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:33:54.491063 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" event={"ID":"6fc2d2e6-4484-4c41-a4f1-f9594e42525b","Type":"ContainerDied","Data":"e2da56f2e976b9ad4be6a313781ad88e71f6e36e9b73acbdc93c76c5d684c2ff"} Apr 16 16:34:00.512715 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:00.512680 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" event={"ID":"6fc2d2e6-4484-4c41-a4f1-f9594e42525b","Type":"ContainerStarted","Data":"42ff39887a334887876a4115f41bb57f3042c57aabd466fc3c8c0ab5c6a68c85"} Apr 16 16:34:00.529558 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:00.529461 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" podStartSLOduration=1.041894912 podStartE2EDuration="14.529445756s" podCreationTimestamp="2026-04-16 16:33:46 +0000 UTC" firstStartedPulling="2026-04-16 16:33:46.938204952 +0000 UTC m=+246.882714923" lastFinishedPulling="2026-04-16 16:34:00.425755779 +0000 UTC m=+260.370265767" observedRunningTime="2026-04-16 16:34:00.527868367 
+0000 UTC m=+260.472378382" watchObservedRunningTime="2026-04-16 16:34:00.529445756 +0000 UTC m=+260.473955741"
Apr 16 16:34:01.517381 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:01.517341 2569 generic.go:358] "Generic (PLEG): container finished" podID="6fc2d2e6-4484-4c41-a4f1-f9594e42525b" containerID="42ff39887a334887876a4115f41bb57f3042c57aabd466fc3c8c0ab5c6a68c85" exitCode=0
Apr 16 16:34:01.517766 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:01.517397 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" event={"ID":"6fc2d2e6-4484-4c41-a4f1-f9594e42525b","Type":"ContainerDied","Data":"42ff39887a334887876a4115f41bb57f3042c57aabd466fc3c8c0ab5c6a68c85"}
Apr 16 16:34:02.636417 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:02.636395 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc"
Apr 16 16:34:02.735681 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:02.735650 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-util\") pod \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") "
Apr 16 16:34:02.735681 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:02.735683 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-bundle\") pod \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") "
Apr 16 16:34:02.735878 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:02.735745 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltwqm\" (UniqueName: \"kubernetes.io/projected/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-kube-api-access-ltwqm\") pod \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\" (UID: \"6fc2d2e6-4484-4c41-a4f1-f9594e42525b\") "
Apr 16 16:34:02.736283 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:02.736246 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-bundle" (OuterVolumeSpecName: "bundle") pod "6fc2d2e6-4484-4c41-a4f1-f9594e42525b" (UID: "6fc2d2e6-4484-4c41-a4f1-f9594e42525b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:34:02.737858 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:02.737829 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-kube-api-access-ltwqm" (OuterVolumeSpecName: "kube-api-access-ltwqm") pod "6fc2d2e6-4484-4c41-a4f1-f9594e42525b" (UID: "6fc2d2e6-4484-4c41-a4f1-f9594e42525b"). InnerVolumeSpecName "kube-api-access-ltwqm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:34:02.739585 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:02.739564 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-util" (OuterVolumeSpecName: "util") pod "6fc2d2e6-4484-4c41-a4f1-f9594e42525b" (UID: "6fc2d2e6-4484-4c41-a4f1-f9594e42525b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:34:02.836881 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:02.836788 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ltwqm\" (UniqueName: \"kubernetes.io/projected/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-kube-api-access-ltwqm\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:34:02.836881 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:02.836824 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-util\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:34:02.836881 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:02.836837 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fc2d2e6-4484-4c41-a4f1-f9594e42525b-bundle\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:34:03.524267 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:03.524245 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" Apr 16 16:34:03.524450 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:03.524235 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswbtc" event={"ID":"6fc2d2e6-4484-4c41-a4f1-f9594e42525b","Type":"ContainerDied","Data":"2ae66be95aa94c5b019bb49bdab16395aeedd80e433f4245f0c1c17e1ff04a6e"} Apr 16 16:34:03.524450 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:03.524352 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ae66be95aa94c5b019bb49bdab16395aeedd80e433f4245f0c1c17e1ff04a6e" Apr 16 16:34:08.025128 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.025073 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v"] Apr 16 16:34:08.025504 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.025358 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fc2d2e6-4484-4c41-a4f1-f9594e42525b" containerName="pull" Apr 16 16:34:08.025504 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.025369 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc2d2e6-4484-4c41-a4f1-f9594e42525b" containerName="pull" Apr 16 16:34:08.025504 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.025376 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fc2d2e6-4484-4c41-a4f1-f9594e42525b" containerName="extract" Apr 16 16:34:08.025504 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.025381 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc2d2e6-4484-4c41-a4f1-f9594e42525b" containerName="extract" Apr 16 16:34:08.025504 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.025393 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fc2d2e6-4484-4c41-a4f1-f9594e42525b" containerName="util" Apr 16 16:34:08.025504 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.025399 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc2d2e6-4484-4c41-a4f1-f9594e42525b" containerName="util" Apr 16 16:34:08.025504 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.025462 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fc2d2e6-4484-4c41-a4f1-f9594e42525b" containerName="extract" Apr 
16 16:34:08.049191 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.049166 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v"]
Apr 16 16:34:08.049335 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.049279 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v"
Apr 16 16:34:08.051585 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.051561 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 16:34:08.051585 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.051576 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 16:34:08.051738 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.051604 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 16:34:08.051792 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.051773 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-vzdsd\""
Apr 16 16:34:08.077291 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.077264 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lh2g\" (UniqueName: \"kubernetes.io/projected/4de00110-599e-41d4-b614-b8a1508b9f05-kube-api-access-2lh2g\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x966v\" (UID: \"4de00110-599e-41d4-b614-b8a1508b9f05\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v"
Apr 16 16:34:08.077422 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.077365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/4de00110-599e-41d4-b614-b8a1508b9f05-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x966v\" (UID: \"4de00110-599e-41d4-b614-b8a1508b9f05\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v"
Apr 16 16:34:08.178683 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.178642 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/4de00110-599e-41d4-b614-b8a1508b9f05-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x966v\" (UID: \"4de00110-599e-41d4-b614-b8a1508b9f05\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v"
Apr 16 16:34:08.178871 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.178695 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lh2g\" (UniqueName: \"kubernetes.io/projected/4de00110-599e-41d4-b614-b8a1508b9f05-kube-api-access-2lh2g\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x966v\" (UID: \"4de00110-599e-41d4-b614-b8a1508b9f05\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v"
Apr 16 16:34:08.180939 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.180913 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/4de00110-599e-41d4-b614-b8a1508b9f05-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x966v\" (UID: \"4de00110-599e-41d4-b614-b8a1508b9f05\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v" Apr 16 16:34:08.186815 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.186796 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lh2g\" (UniqueName: \"kubernetes.io/projected/4de00110-599e-41d4-b614-b8a1508b9f05-kube-api-access-2lh2g\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x966v\" (UID: \"4de00110-599e-41d4-b614-b8a1508b9f05\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v" Apr 16 16:34:08.359269 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.359182 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v" Apr 16 16:34:08.474047 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.474014 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v"] Apr 16 16:34:08.477710 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:34:08.477684 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de00110_599e_41d4_b614_b8a1508b9f05.slice/crio-b2ee488e0cac1f72f0cff3a97fd8ea82561b9cace79b519640cb7b245a14e580 WatchSource:0}: Error finding container b2ee488e0cac1f72f0cff3a97fd8ea82561b9cace79b519640cb7b245a14e580: Status 404 returned error can't find the container with id b2ee488e0cac1f72f0cff3a97fd8ea82561b9cace79b519640cb7b245a14e580 Apr 16 16:34:08.538760 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:08.538730 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v" event={"ID":"4de00110-599e-41d4-b614-b8a1508b9f05","Type":"ContainerStarted","Data":"b2ee488e0cac1f72f0cff3a97fd8ea82561b9cace79b519640cb7b245a14e580"} Apr 16 16:34:12.548111 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.548078 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-s4nd2"] Apr 16 16:34:12.566512 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.566476 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v" event={"ID":"4de00110-599e-41d4-b614-b8a1508b9f05","Type":"ContainerStarted","Data":"26b6ac0e10d40c60c75e5f721a491038c87fe3a13da441bae94cb8f2d15046cd"} Apr 16 16:34:12.566512 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.566511 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-s4nd2"] Apr 16 16:34:12.566701 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.566649 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v" Apr 16 16:34:12.566783 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.566766 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:12.568802 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.568778 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 16:34:12.568913 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.568779 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rbnn6\"" Apr 16 16:34:12.568913 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.568870 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 16:34:12.583708 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.583663 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v" podStartSLOduration=1.072715165 podStartE2EDuration="4.583649439s" podCreationTimestamp="2026-04-16 16:34:08 +0000 UTC" firstStartedPulling="2026-04-16 16:34:08.479666895 +0000 UTC m=+268.424176880" lastFinishedPulling="2026-04-16 16:34:11.990601182 +0000 UTC m=+271.935111154" observedRunningTime="2026-04-16 16:34:12.582456839 +0000 UTC m=+272.526966832" watchObservedRunningTime="2026-04-16 16:34:12.583649439 +0000 UTC m=+272.528159429" Apr 16 16:34:12.615827 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.615800 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:12.616009 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.615857 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwz9d\" (UniqueName: \"kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-kube-api-access-xwz9d\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:12.616009 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.615902 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/8b08aba5-1f6a-4291-a585-fc2cba58ea19-cabundle0\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:12.716936 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.716902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/8b08aba5-1f6a-4291-a585-fc2cba58ea19-cabundle0\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:12.717114 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.716971 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:12.717114 ip-10-0-132-191 kubenswrapper[2569]: I0416 
16:34:12.716994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwz9d\" (UniqueName: \"kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-kube-api-access-xwz9d\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:12.717114 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:12.717105 2569 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 16:34:12.717229 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:12.717123 2569 secret.go:281] references non-existent secret key: ca.crt Apr 16 16:34:12.717229 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:12.717130 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 16:34:12.717229 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:12.717144 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-s4nd2: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 16:34:12.717229 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:12.717189 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates podName:8b08aba5-1f6a-4291-a585-fc2cba58ea19 nodeName:}" failed. No retries permitted until 2026-04-16 16:34:13.21717537 +0000 UTC m=+273.161685340 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates") pod "keda-operator-ffbb595cb-s4nd2" (UID: "8b08aba5-1f6a-4291-a585-fc2cba58ea19") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 16:34:12.717587 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.717570 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/8b08aba5-1f6a-4291-a585-fc2cba58ea19-cabundle0\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:12.725183 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.725162 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwz9d\" (UniqueName: \"kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-kube-api-access-xwz9d\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:12.867858 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.867781 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc"] Apr 16 16:34:12.894855 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.894826 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc"] Apr 16 16:34:12.894985 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.894940 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:12.897347 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:12.897325 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 16:34:13.020246 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.020211 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/28a92454-54f9-4ddc-99dd-5c2e7731349b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:13.020246 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.020256 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drf8z\" (UniqueName: \"kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-kube-api-access-drf8z\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:13.020500 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.020337 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:13.100776 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.100745 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-6wx7k"] Apr 16 16:34:13.113010 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.112981 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6wx7k"] Apr 16 16:34:13.113161 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.113118 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:13.116317 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.116297 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 16:34:13.122567 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.120905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/28a92454-54f9-4ddc-99dd-5c2e7731349b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:13.122567 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.120939 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drf8z\" (UniqueName: \"kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-kube-api-access-drf8z\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:13.122567 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.120977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:13.122567 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.121068 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 16:34:13.122567 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.121082 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 16:34:13.122567 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.121102 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc: references non-existent secret key: tls.crt Apr 16 16:34:13.122567 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.121149 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates podName:28a92454-54f9-4ddc-99dd-5c2e7731349b nodeName:}" failed. No retries permitted until 2026-04-16 16:34:13.621133016 +0000 UTC m=+273.565642988 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates") pod "keda-metrics-apiserver-7c9f485588-flfxc" (UID: "28a92454-54f9-4ddc-99dd-5c2e7731349b") : references non-existent secret key: tls.crt Apr 16 16:34:13.122567 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.121692 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/28a92454-54f9-4ddc-99dd-5c2e7731349b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:13.138623 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.138595 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drf8z\" (UniqueName: \"kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-kube-api-access-drf8z\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:13.222163 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.222125 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:13.222163 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.222166 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7-certificates\") pod \"keda-admission-cf49989db-6wx7k\" (UID: \"d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7\") " pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:13.222422 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.222190 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvtzw\" (UniqueName: \"kubernetes.io/projected/d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7-kube-api-access-kvtzw\") pod \"keda-admission-cf49989db-6wx7k\" (UID: \"d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7\") " pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:13.222422 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.222288 2569 secret.go:281] references non-existent secret key: ca.crt Apr 16 16:34:13.222422 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.222308 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 16:34:13.222422 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.222319 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-s4nd2: references non-existent secret key: ca.crt Apr 16 16:34:13.222422 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.222373 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates podName:8b08aba5-1f6a-4291-a585-fc2cba58ea19 nodeName:}" failed. No retries permitted until 2026-04-16 16:34:14.22235959 +0000 UTC m=+274.166869565 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates") pod "keda-operator-ffbb595cb-s4nd2" (UID: "8b08aba5-1f6a-4291-a585-fc2cba58ea19") : references non-existent secret key: ca.crt Apr 16 16:34:13.322803 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.322763 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7-certificates\") pod \"keda-admission-cf49989db-6wx7k\" (UID: \"d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7\") " pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:13.322803 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.322801 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvtzw\" (UniqueName: \"kubernetes.io/projected/d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7-kube-api-access-kvtzw\") pod \"keda-admission-cf49989db-6wx7k\" (UID: \"d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7\") " pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:13.323058 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.322938 2569 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 16:34:13.323058 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.322974 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-6wx7k: secret "keda-admission-webhooks-certs" not found Apr 16 16:34:13.323058 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.323038 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7-certificates podName:d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7 nodeName:}" failed. No retries permitted until 2026-04-16 16:34:13.823017316 +0000 UTC m=+273.767527305 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7-certificates") pod "keda-admission-cf49989db-6wx7k" (UID: "d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7") : secret "keda-admission-webhooks-certs" not found Apr 16 16:34:13.337326 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.337298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvtzw\" (UniqueName: \"kubernetes.io/projected/d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7-kube-api-access-kvtzw\") pod \"keda-admission-cf49989db-6wx7k\" (UID: \"d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7\") " pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:13.625530 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.625499 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:13.626006 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.625620 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 16:34:13.626006 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.625632 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 16:34:13.626006 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.625648 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc: references non-existent secret key: tls.crt Apr 16 16:34:13.626006 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:13.625694 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates podName:28a92454-54f9-4ddc-99dd-5c2e7731349b nodeName:}" failed. No retries permitted until 2026-04-16 16:34:14.625680331 +0000 UTC m=+274.570190304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates") pod "keda-metrics-apiserver-7c9f485588-flfxc" (UID: "28a92454-54f9-4ddc-99dd-5c2e7731349b") : references non-existent secret key: tls.crt Apr 16 16:34:13.828023 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.827989 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7-certificates\") pod \"keda-admission-cf49989db-6wx7k\" (UID: \"d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7\") " pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:13.830300 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:13.830270 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7-certificates\") pod \"keda-admission-cf49989db-6wx7k\" (UID: \"d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7\") " pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:14.025240 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:14.025212 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:14.142837 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:14.142800 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6wx7k"] Apr 16 16:34:14.151187 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:34:14.145916 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3f58d9f_8d1f_4ceb_9bf5_4c179e03ace7.slice/crio-1a13bc08cb2e70097f1a7cb63795079daef34aabd6950a4f91439b4a0ac95c28 WatchSource:0}: Error finding container 1a13bc08cb2e70097f1a7cb63795079daef34aabd6950a4f91439b4a0ac95c28: Status 404 returned error can't find the container with id 1a13bc08cb2e70097f1a7cb63795079daef34aabd6950a4f91439b4a0ac95c28 Apr 16 16:34:14.231452 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:14.231408 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:14.231587 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:14.231574 2569 secret.go:281] references non-existent secret key: ca.crt Apr 16 16:34:14.231638 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:14.231590 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 16:34:14.231638 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:14.231599 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-s4nd2: references non-existent secret key: ca.crt Apr 16 16:34:14.231705 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:14.231651 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates podName:8b08aba5-1f6a-4291-a585-fc2cba58ea19 nodeName:}" failed. No retries permitted until 2026-04-16 16:34:16.231633745 +0000 UTC m=+276.176143728 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates") pod "keda-operator-ffbb595cb-s4nd2" (UID: "8b08aba5-1f6a-4291-a585-fc2cba58ea19") : references non-existent secret key: ca.crt Apr 16 16:34:14.558888 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:14.558853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6wx7k" event={"ID":"d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7","Type":"ContainerStarted","Data":"1a13bc08cb2e70097f1a7cb63795079daef34aabd6950a4f91439b4a0ac95c28"} Apr 16 16:34:14.635034 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:14.635000 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:14.635474 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:14.635142 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 16:34:14.635474 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:14.635163 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 16:34:14.635474 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:14.635207 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc: references non-existent secret key: tls.crt Apr 16 16:34:14.635474 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:14.635273 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates podName:28a92454-54f9-4ddc-99dd-5c2e7731349b nodeName:}" failed. No retries permitted until 2026-04-16 16:34:16.635252028 +0000 UTC m=+276.579762020 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates") pod "keda-metrics-apiserver-7c9f485588-flfxc" (UID: "28a92454-54f9-4ddc-99dd-5c2e7731349b") : references non-existent secret key: tls.crt Apr 16 16:34:16.250314 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:16.250280 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:16.250686 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:16.250462 2569 secret.go:281] references non-existent secret key: ca.crt Apr 16 16:34:16.250686 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:16.250487 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 16:34:16.250686 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:16.250500 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-s4nd2: references non-existent secret key: ca.crt Apr 16 16:34:16.250686 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:16.250568 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates podName:8b08aba5-1f6a-4291-a585-fc2cba58ea19 nodeName:}" failed. No retries permitted until 2026-04-16 16:34:20.250547446 +0000 UTC m=+280.195057424 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates") pod "keda-operator-ffbb595cb-s4nd2" (UID: "8b08aba5-1f6a-4291-a585-fc2cba58ea19") : references non-existent secret key: ca.crt Apr 16 16:34:16.653955 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:16.653930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:16.654063 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:16.654047 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 16:34:16.654063 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:16.654060 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 16:34:16.654141 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:16.654079 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc: references non-existent secret key: tls.crt Apr 16 16:34:16.654141 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:34:16.654119 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates podName:28a92454-54f9-4ddc-99dd-5c2e7731349b nodeName:}" failed. No retries permitted until 2026-04-16 16:34:20.654107263 +0000 UTC m=+280.598617234 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates") pod "keda-metrics-apiserver-7c9f485588-flfxc" (UID: "28a92454-54f9-4ddc-99dd-5c2e7731349b") : references non-existent secret key: tls.crt Apr 16 16:34:17.568859 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:17.568822 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6wx7k" event={"ID":"d3f58d9f-8d1f-4ceb-9bf5-4c179e03ace7","Type":"ContainerStarted","Data":"f3c8f568f12a4e4542bbe4fce2a48b8550033601b96e3df238e9f06016023161"} Apr 16 16:34:17.569231 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:17.568946 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:17.585738 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:17.585692 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-6wx7k" podStartSLOduration=2.163283212 podStartE2EDuration="4.585678025s" podCreationTimestamp="2026-04-16 16:34:13 +0000 UTC" firstStartedPulling="2026-04-16 16:34:14.153074525 +0000 UTC m=+274.097584496" lastFinishedPulling="2026-04-16 16:34:16.57546933 +0000 UTC m=+276.519979309" observedRunningTime="2026-04-16 16:34:17.584559735 +0000 UTC m=+277.529069739" watchObservedRunningTime="2026-04-16 16:34:17.585678025 +0000 UTC m=+277.530188022" Apr 16 16:34:20.283313 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:20.283280 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:20.285621 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:20.285604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8b08aba5-1f6a-4291-a585-fc2cba58ea19-certificates\") pod \"keda-operator-ffbb595cb-s4nd2\" (UID: \"8b08aba5-1f6a-4291-a585-fc2cba58ea19\") " pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:20.376648 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:20.376619 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:20.493275 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:20.493249 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-s4nd2"] Apr 16 16:34:20.496219 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:34:20.496189 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b08aba5_1f6a_4291_a585_fc2cba58ea19.slice/crio-749d2f88571d78f6826ae0ffde2868d9d9dc51a459c2d096eaf95d5a4d3c1576 WatchSource:0}: Error finding container 749d2f88571d78f6826ae0ffde2868d9d9dc51a459c2d096eaf95d5a4d3c1576: Status 404 returned error can't find the container with id 749d2f88571d78f6826ae0ffde2868d9d9dc51a459c2d096eaf95d5a4d3c1576 Apr 16 16:34:20.577949 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:20.577872 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" event={"ID":"8b08aba5-1f6a-4291-a585-fc2cba58ea19","Type":"ContainerStarted","Data":"749d2f88571d78f6826ae0ffde2868d9d9dc51a459c2d096eaf95d5a4d3c1576"} Apr 16 16:34:20.687274 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:20.687245 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:20.689748 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:20.689730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/28a92454-54f9-4ddc-99dd-5c2e7731349b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-flfxc\" (UID: \"28a92454-54f9-4ddc-99dd-5c2e7731349b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:20.705495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:20.705471 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:20.822968 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:20.822944 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc"] Apr 16 16:34:20.825166 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:34:20.825138 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a92454_54f9_4ddc_99dd_5c2e7731349b.slice/crio-38cd2eda39c11f5317fdbe0dfab0e0b62d01a80349be36e477d68c995a8562f2 WatchSource:0}: Error finding container 38cd2eda39c11f5317fdbe0dfab0e0b62d01a80349be36e477d68c995a8562f2: Status 404 returned error can't find the container with id 38cd2eda39c11f5317fdbe0dfab0e0b62d01a80349be36e477d68c995a8562f2 Apr 16 16:34:21.582204 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:21.582169 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" event={"ID":"28a92454-54f9-4ddc-99dd-5c2e7731349b","Type":"ContainerStarted","Data":"38cd2eda39c11f5317fdbe0dfab0e0b62d01a80349be36e477d68c995a8562f2"} Apr 16 16:34:25.598580 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:25.598546 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" event={"ID":"8b08aba5-1f6a-4291-a585-fc2cba58ea19","Type":"ContainerStarted","Data":"7157fc87f41267aa8cb06bd57c12537bc35e6f8da55a8b04f04476fdc98f5c02"} Apr 16 16:34:25.599052 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:25.598634 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:34:25.599935 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:25.599913 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" event={"ID":"28a92454-54f9-4ddc-99dd-5c2e7731349b","Type":"ContainerStarted","Data":"d815052b898066972f3adc27cde8af02f4274f96c144bfbbf453358442e39153"} Apr 16 16:34:25.600042 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:25.600012 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:25.614076 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:25.614036 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" podStartSLOduration=9.553197766 podStartE2EDuration="13.614023922s" podCreationTimestamp="2026-04-16 16:34:12 +0000 UTC" firstStartedPulling="2026-04-16 16:34:20.497559702 +0000 UTC m=+280.442069674" lastFinishedPulling="2026-04-16 16:34:24.55838586 +0000 UTC m=+284.502895830" observedRunningTime="2026-04-16 16:34:25.612376593 +0000 UTC m=+285.556886587" watchObservedRunningTime="2026-04-16 16:34:25.614023922 +0000 UTC m=+285.558533915" Apr 16 16:34:25.627305 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:25.627263 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" podStartSLOduration=9.900535615999999 podStartE2EDuration="13.627251341s" podCreationTimestamp="2026-04-16 16:34:12 +0000 UTC" firstStartedPulling="2026-04-16 16:34:20.826517324 +0000 UTC m=+280.771027298" lastFinishedPulling="2026-04-16 16:34:24.553233048 +0000 UTC m=+284.497743023" observedRunningTime="2026-04-16 16:34:25.625909808 +0000 UTC m=+285.570419802" 
watchObservedRunningTime="2026-04-16 16:34:25.627251341 +0000 UTC m=+285.571761334" Apr 16 16:34:33.558331 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:33.558302 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x966v" Apr 16 16:34:36.607214 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:36.607189 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-flfxc" Apr 16 16:34:38.573392 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:38.573360 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-6wx7k" Apr 16 16:34:40.531982 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:40.531958 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:34:40.532381 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:40.532041 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:34:40.534641 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:40.534625 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:34:46.604749 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:34:46.604719 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-s4nd2" Apr 16 16:35:19.387285 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.387209 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-6jbld"] Apr 16 16:35:19.396407 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.396386 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:19.397276 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.397255 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6"] Apr 16 16:35:19.399325 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.399238 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-9m78q\"" Apr 16 16:35:19.399325 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.399257 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:35:19.399325 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.399305 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 16:35:19.399325 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.399318 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:35:19.400186 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.400168 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-6jbld"] Apr 16 16:35:19.400270 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.400252 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:19.402320 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.402301 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 16:35:19.402470 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.402326 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-87lk7\"" Apr 16 16:35:19.409781 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.409763 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6"] Apr 16 16:35:19.426713 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.426694 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-r4lx9"] Apr 16 16:35:19.429773 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.429760 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-r4lx9" Apr 16 16:35:19.431729 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.431710 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 16:35:19.431818 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.431719 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-x4ndd\"" Apr 16 16:35:19.438984 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.438963 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-r4lx9"] Apr 16 16:35:19.450602 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.450582 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c90a446-1863-4730-99e4-944444c75b47-cert\") pod \"kserve-controller-manager-55c74f6fbc-6jbld\" (UID: \"3c90a446-1863-4730-99e4-944444c75b47\") " pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:19.450689 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.450623 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdknh\" (UniqueName: \"kubernetes.io/projected/3c90a446-1863-4730-99e4-944444c75b47-kube-api-access-tdknh\") pod \"kserve-controller-manager-55c74f6fbc-6jbld\" (UID: \"3c90a446-1863-4730-99e4-944444c75b47\") " pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:19.450689 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.450646 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk9nn\" (UniqueName: \"kubernetes.io/projected/e8189e32-a5ea-4e94-8c62-9860c889a1c3-kube-api-access-gk9nn\") pod \"llmisvc-controller-manager-68cc5db7c4-qtbx6\" (UID: \"e8189e32-a5ea-4e94-8c62-9860c889a1c3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:19.450809 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.450756 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8189e32-a5ea-4e94-8c62-9860c889a1c3-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-qtbx6\" (UID: \"e8189e32-a5ea-4e94-8c62-9860c889a1c3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:19.551389 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.551360 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c90a446-1863-4730-99e4-944444c75b47-cert\") pod \"kserve-controller-manager-55c74f6fbc-6jbld\" (UID: \"3c90a446-1863-4730-99e4-944444c75b47\") " pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:19.551389 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.551399 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdknh\" (UniqueName: \"kubernetes.io/projected/3c90a446-1863-4730-99e4-944444c75b47-kube-api-access-tdknh\") pod \"kserve-controller-manager-55c74f6fbc-6jbld\" (UID: \"3c90a446-1863-4730-99e4-944444c75b47\") " pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:19.551643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.551420 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk9nn\" (UniqueName: \"kubernetes.io/projected/e8189e32-a5ea-4e94-8c62-9860c889a1c3-kube-api-access-gk9nn\") pod \"llmisvc-controller-manager-68cc5db7c4-qtbx6\" (UID: \"e8189e32-a5ea-4e94-8c62-9860c889a1c3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:19.551643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.551491 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8189e32-a5ea-4e94-8c62-9860c889a1c3-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-qtbx6\" (UID: \"e8189e32-a5ea-4e94-8c62-9860c889a1c3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:19.551643 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:35:19.551536 2569 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 16:35:19.551643 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:35:19.551600 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c90a446-1863-4730-99e4-944444c75b47-cert podName:3c90a446-1863-4730-99e4-944444c75b47 nodeName:}" failed. No retries permitted until 2026-04-16 16:35:20.051578985 +0000 UTC m=+339.996088956 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c90a446-1863-4730-99e4-944444c75b47-cert") pod "kserve-controller-manager-55c74f6fbc-6jbld" (UID: "3c90a446-1863-4730-99e4-944444c75b47") : secret "kserve-webhook-server-cert" not found Apr 16 16:35:19.551643 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:35:19.551619 2569 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 16:35:19.551643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.551542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk44k\" (UniqueName: \"kubernetes.io/projected/a95be73d-0f4e-4f19-96a7-489ec171e9c6-kube-api-access-tk44k\") pod \"seaweedfs-86cc847c5c-r4lx9\" (UID: \"a95be73d-0f4e-4f19-96a7-489ec171e9c6\") " pod="kserve/seaweedfs-86cc847c5c-r4lx9" Apr 16 16:35:19.551860 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:35:19.551667 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8189e32-a5ea-4e94-8c62-9860c889a1c3-cert podName:e8189e32-a5ea-4e94-8c62-9860c889a1c3 nodeName:}" failed. No retries permitted until 2026-04-16 16:35:20.051650289 +0000 UTC m=+339.996160269 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8189e32-a5ea-4e94-8c62-9860c889a1c3-cert") pod "llmisvc-controller-manager-68cc5db7c4-qtbx6" (UID: "e8189e32-a5ea-4e94-8c62-9860c889a1c3") : secret "llmisvc-webhook-server-cert" not found Apr 16 16:35:19.551860 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.551689 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a95be73d-0f4e-4f19-96a7-489ec171e9c6-data\") pod \"seaweedfs-86cc847c5c-r4lx9\" (UID: \"a95be73d-0f4e-4f19-96a7-489ec171e9c6\") " pod="kserve/seaweedfs-86cc847c5c-r4lx9" Apr 16 16:35:19.562270 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.562234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk9nn\" (UniqueName: \"kubernetes.io/projected/e8189e32-a5ea-4e94-8c62-9860c889a1c3-kube-api-access-gk9nn\") pod \"llmisvc-controller-manager-68cc5db7c4-qtbx6\" (UID: \"e8189e32-a5ea-4e94-8c62-9860c889a1c3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:19.562371 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.562335 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdknh\" (UniqueName: \"kubernetes.io/projected/3c90a446-1863-4730-99e4-944444c75b47-kube-api-access-tdknh\") pod \"kserve-controller-manager-55c74f6fbc-6jbld\" (UID: \"3c90a446-1863-4730-99e4-944444c75b47\") " pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:19.652083 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.652055 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk44k\" (UniqueName: \"kubernetes.io/projected/a95be73d-0f4e-4f19-96a7-489ec171e9c6-kube-api-access-tk44k\") pod \"seaweedfs-86cc847c5c-r4lx9\" (UID: \"a95be73d-0f4e-4f19-96a7-489ec171e9c6\") " pod="kserve/seaweedfs-86cc847c5c-r4lx9" Apr 16 16:35:19.652083 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.652085 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a95be73d-0f4e-4f19-96a7-489ec171e9c6-data\") pod \"seaweedfs-86cc847c5c-r4lx9\" (UID: \"a95be73d-0f4e-4f19-96a7-489ec171e9c6\") " pod="kserve/seaweedfs-86cc847c5c-r4lx9" Apr 16 16:35:19.652503 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.652483 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a95be73d-0f4e-4f19-96a7-489ec171e9c6-data\") pod \"seaweedfs-86cc847c5c-r4lx9\" (UID: \"a95be73d-0f4e-4f19-96a7-489ec171e9c6\") " pod="kserve/seaweedfs-86cc847c5c-r4lx9" Apr 16 16:35:19.659947 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.659921 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk44k\" (UniqueName: \"kubernetes.io/projected/a95be73d-0f4e-4f19-96a7-489ec171e9c6-kube-api-access-tk44k\") pod \"seaweedfs-86cc847c5c-r4lx9\" (UID: \"a95be73d-0f4e-4f19-96a7-489ec171e9c6\") " pod="kserve/seaweedfs-86cc847c5c-r4lx9" Apr 16 16:35:19.739235 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.739211 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-r4lx9" Apr 16 16:35:19.854242 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.854218 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-r4lx9"] Apr 16 16:35:19.855939 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:35:19.855912 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda95be73d_0f4e_4f19_96a7_489ec171e9c6.slice/crio-dc78e80f7bf2822dd6c6f444178bee2cd383ef70b06fbeec0053066802745a73 WatchSource:0}: Error finding container dc78e80f7bf2822dd6c6f444178bee2cd383ef70b06fbeec0053066802745a73: Status 404 returned error can't find the container with id dc78e80f7bf2822dd6c6f444178bee2cd383ef70b06fbeec0053066802745a73 Apr 16 16:35:19.857109 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:19.857094 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:35:20.055388 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.055294 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c90a446-1863-4730-99e4-944444c75b47-cert\") pod \"kserve-controller-manager-55c74f6fbc-6jbld\" (UID: \"3c90a446-1863-4730-99e4-944444c75b47\") " pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:20.055388 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.055385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8189e32-a5ea-4e94-8c62-9860c889a1c3-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-qtbx6\" (UID: \"e8189e32-a5ea-4e94-8c62-9860c889a1c3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:20.057619 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.057595 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8189e32-a5ea-4e94-8c62-9860c889a1c3-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-qtbx6\" (UID: \"e8189e32-a5ea-4e94-8c62-9860c889a1c3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:20.057619 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.057612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c90a446-1863-4730-99e4-944444c75b47-cert\") pod \"kserve-controller-manager-55c74f6fbc-6jbld\" (UID: \"3c90a446-1863-4730-99e4-944444c75b47\") " pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:20.308323 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.308227 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:20.314708 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.314689 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:20.461505 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.461458 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6"] Apr 16 16:35:20.466615 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:35:20.466582 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode8189e32_a5ea_4e94_8c62_9860c889a1c3.slice/crio-66115e1f04a82a0867bf7bc64e2339ebc33553c798a110e6ad3d0610c7d79800 WatchSource:0}: Error finding container 66115e1f04a82a0867bf7bc64e2339ebc33553c798a110e6ad3d0610c7d79800: Status 404 returned error can't find the container with id 66115e1f04a82a0867bf7bc64e2339ebc33553c798a110e6ad3d0610c7d79800 Apr 16 16:35:20.479780 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.479757 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-6jbld"] Apr 16 16:35:20.482597 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:35:20.482574 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c90a446_1863_4730_99e4_944444c75b47.slice/crio-d1ffbdacc3a229905ad627cf70d682d39aaa4f5095597b20282a5e017ab8ba93 WatchSource:0}: Error finding container d1ffbdacc3a229905ad627cf70d682d39aaa4f5095597b20282a5e017ab8ba93: Status 404 returned error can't find the container with id d1ffbdacc3a229905ad627cf70d682d39aaa4f5095597b20282a5e017ab8ba93 Apr 16 16:35:20.766663 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.766629 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-r4lx9" event={"ID":"a95be73d-0f4e-4f19-96a7-489ec171e9c6","Type":"ContainerStarted","Data":"dc78e80f7bf2822dd6c6f444178bee2cd383ef70b06fbeec0053066802745a73"} Apr 16 16:35:20.767774 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.767744 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" event={"ID":"3c90a446-1863-4730-99e4-944444c75b47","Type":"ContainerStarted","Data":"d1ffbdacc3a229905ad627cf70d682d39aaa4f5095597b20282a5e017ab8ba93"} Apr 16 16:35:20.768898 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:20.768869 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" event={"ID":"e8189e32-a5ea-4e94-8c62-9860c889a1c3","Type":"ContainerStarted","Data":"66115e1f04a82a0867bf7bc64e2339ebc33553c798a110e6ad3d0610c7d79800"} Apr 16 16:35:24.787070 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:24.787033 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" event={"ID":"3c90a446-1863-4730-99e4-944444c75b47","Type":"ContainerStarted","Data":"ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9"} Apr 16 16:35:24.787396 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:24.787179 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:24.804365 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:24.804318 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" podStartSLOduration=1.60356155 podStartE2EDuration="5.80430129s" podCreationTimestamp="2026-04-16 16:35:19 +0000 UTC" firstStartedPulling="2026-04-16 16:35:20.48396608 +0000 UTC m=+340.428476052" 
lastFinishedPulling="2026-04-16 16:35:24.68470582 +0000 UTC m=+344.629215792" observedRunningTime="2026-04-16 16:35:24.803294614 +0000 UTC m=+344.747804609" watchObservedRunningTime="2026-04-16 16:35:24.80430129 +0000 UTC m=+344.748811284" Apr 16 16:35:25.791150 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:25.791116 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" event={"ID":"e8189e32-a5ea-4e94-8c62-9860c889a1c3","Type":"ContainerStarted","Data":"69ed5e6af95b2ad7eb56986ee00aa5cbca85219481571c4f2ec745cb4955512d"} Apr 16 16:35:25.791588 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:25.791214 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:25.792492 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:25.792469 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-r4lx9" event={"ID":"a95be73d-0f4e-4f19-96a7-489ec171e9c6","Type":"ContainerStarted","Data":"03892f5030b23cad2a6f09e03dae83484ec7c326fc8c1c629a29f375aea7a349"} Apr 16 16:35:25.792597 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:25.792560 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-r4lx9" Apr 16 16:35:25.807646 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:25.807609 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" podStartSLOduration=2.57869668 podStartE2EDuration="6.807598364s" podCreationTimestamp="2026-04-16 16:35:19 +0000 UTC" firstStartedPulling="2026-04-16 16:35:20.468243151 +0000 UTC m=+340.412753123" lastFinishedPulling="2026-04-16 16:35:24.697144832 +0000 UTC m=+344.641654807" observedRunningTime="2026-04-16 16:35:25.80556847 +0000 UTC m=+345.750078474" watchObservedRunningTime="2026-04-16 16:35:25.807598364 +0000 UTC m=+345.752108356" Apr 16 16:35:25.824733 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:25.824692 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-r4lx9" podStartSLOduration=1.941935068 podStartE2EDuration="6.824681401s" podCreationTimestamp="2026-04-16 16:35:19 +0000 UTC" firstStartedPulling="2026-04-16 16:35:19.85721174 +0000 UTC m=+339.801721711" lastFinishedPulling="2026-04-16 16:35:24.739958065 +0000 UTC m=+344.684468044" observedRunningTime="2026-04-16 16:35:25.82372178 +0000 UTC m=+345.768231788" watchObservedRunningTime="2026-04-16 16:35:25.824681401 +0000 UTC m=+345.769191394" Apr 16 16:35:31.799942 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:31.799910 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-r4lx9" Apr 16 16:35:55.798122 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:55.798092 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:56.800137 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:56.800109 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qtbx6" Apr 16 16:35:58.017652 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.017618 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-6jbld"] Apr 16 16:35:58.018046 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.017868 2569 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" podUID="3c90a446-1863-4730-99e4-944444c75b47" containerName="manager" containerID="cri-o://ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9" gracePeriod=10 Apr 16 16:35:58.039863 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.039838 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-4rtw4"] Apr 16 16:35:58.112336 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.112316 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-4rtw4"] Apr 16 16:35:58.112512 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.112495 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" Apr 16 16:35:58.173397 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.173363 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be948ed2-d002-499c-8906-e9d38ed04dc0-cert\") pod \"kserve-controller-manager-55c74f6fbc-4rtw4\" (UID: \"be948ed2-d002-499c-8906-e9d38ed04dc0\") " pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" Apr 16 16:35:58.173541 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.173481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfst2\" (UniqueName: \"kubernetes.io/projected/be948ed2-d002-499c-8906-e9d38ed04dc0-kube-api-access-qfst2\") pod \"kserve-controller-manager-55c74f6fbc-4rtw4\" (UID: \"be948ed2-d002-499c-8906-e9d38ed04dc0\") " pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" Apr 16 16:35:58.274014 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.273953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfst2\" (UniqueName: \"kubernetes.io/projected/be948ed2-d002-499c-8906-e9d38ed04dc0-kube-api-access-qfst2\") pod \"kserve-controller-manager-55c74f6fbc-4rtw4\" (UID: \"be948ed2-d002-499c-8906-e9d38ed04dc0\") " pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" Apr 16 16:35:58.274014 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.274000 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be948ed2-d002-499c-8906-e9d38ed04dc0-cert\") pod \"kserve-controller-manager-55c74f6fbc-4rtw4\" (UID: \"be948ed2-d002-499c-8906-e9d38ed04dc0\") " pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" Apr 16 16:35:58.276251 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.276233 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be948ed2-d002-499c-8906-e9d38ed04dc0-cert\") pod \"kserve-controller-manager-55c74f6fbc-4rtw4\" (UID: \"be948ed2-d002-499c-8906-e9d38ed04dc0\") " pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" Apr 16 16:35:58.281666 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.281645 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfst2\" (UniqueName: \"kubernetes.io/projected/be948ed2-d002-499c-8906-e9d38ed04dc0-kube-api-access-qfst2\") pod \"kserve-controller-manager-55c74f6fbc-4rtw4\" (UID: \"be948ed2-d002-499c-8906-e9d38ed04dc0\") " pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" Apr 16 16:35:58.290192 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.290174 2569 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:58.374850 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.374824 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdknh\" (UniqueName: \"kubernetes.io/projected/3c90a446-1863-4730-99e4-944444c75b47-kube-api-access-tdknh\") pod \"3c90a446-1863-4730-99e4-944444c75b47\" (UID: \"3c90a446-1863-4730-99e4-944444c75b47\") " Apr 16 16:35:58.375019 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.374860 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c90a446-1863-4730-99e4-944444c75b47-cert\") pod \"3c90a446-1863-4730-99e4-944444c75b47\" (UID: \"3c90a446-1863-4730-99e4-944444c75b47\") " Apr 16 16:35:58.376856 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.376830 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c90a446-1863-4730-99e4-944444c75b47-cert" (OuterVolumeSpecName: "cert") pod "3c90a446-1863-4730-99e4-944444c75b47" (UID: "3c90a446-1863-4730-99e4-944444c75b47"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:35:58.376954 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.376866 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c90a446-1863-4730-99e4-944444c75b47-kube-api-access-tdknh" (OuterVolumeSpecName: "kube-api-access-tdknh") pod "3c90a446-1863-4730-99e4-944444c75b47" (UID: "3c90a446-1863-4730-99e4-944444c75b47"). InnerVolumeSpecName "kube-api-access-tdknh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:35:58.476149 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.476117 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tdknh\" (UniqueName: \"kubernetes.io/projected/3c90a446-1863-4730-99e4-944444c75b47-kube-api-access-tdknh\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:35:58.476149 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.476145 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c90a446-1863-4730-99e4-944444c75b47-cert\") on node \"ip-10-0-132-191.ec2.internal\" DevicePath \"\"" Apr 16 16:35:58.478084 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.478058 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" Apr 16 16:35:58.597104 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.597072 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-4rtw4"] Apr 16 16:35:58.600501 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:35:58.600474 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe948ed2_d002_499c_8906_e9d38ed04dc0.slice/crio-9934910043aeb0aa073c01538fdb832378d0e8cdacaa504481f774cc2a5c0b3c WatchSource:0}: Error finding container 9934910043aeb0aa073c01538fdb832378d0e8cdacaa504481f774cc2a5c0b3c: Status 404 returned error can't find the container with id 9934910043aeb0aa073c01538fdb832378d0e8cdacaa504481f774cc2a5c0b3c Apr 16 16:35:58.899539 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.899506 2569 generic.go:358] "Generic (PLEG): container finished" podID="3c90a446-1863-4730-99e4-944444c75b47" containerID="ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9" exitCode=0 Apr 16 16:35:58.899704 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.899569 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" Apr 16 16:35:58.899704 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.899573 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" event={"ID":"3c90a446-1863-4730-99e4-944444c75b47","Type":"ContainerDied","Data":"ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9"} Apr 16 16:35:58.899704 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.899601 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-6jbld" event={"ID":"3c90a446-1863-4730-99e4-944444c75b47","Type":"ContainerDied","Data":"d1ffbdacc3a229905ad627cf70d682d39aaa4f5095597b20282a5e017ab8ba93"} Apr 16 16:35:58.899704 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.899623 2569 scope.go:117] "RemoveContainer" containerID="ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9" Apr 16 16:35:58.900643 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.900623 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" event={"ID":"be948ed2-d002-499c-8906-e9d38ed04dc0","Type":"ContainerStarted","Data":"9934910043aeb0aa073c01538fdb832378d0e8cdacaa504481f774cc2a5c0b3c"} Apr 16 16:35:58.907490 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.907473 2569 scope.go:117] "RemoveContainer" containerID="ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9" Apr 16 16:35:58.907742 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:35:58.907723 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9\": container with ID starting with ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9 not found: ID does not exist" containerID="ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9" Apr 16 16:35:58.907788 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.907752 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9"} err="failed to get container status 
\"ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9\": rpc error: code = NotFound desc = could not find container \"ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9\": container with ID starting with ef7e9018dfb2d9382c325e1218801393cbfef40fca577229801e01db3cb3dfe9 not found: ID does not exist" Apr 16 16:35:58.913851 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.913832 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-6jbld"] Apr 16 16:35:58.917456 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:58.917423 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-6jbld"] Apr 16 16:35:59.905756 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:59.905723 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" event={"ID":"be948ed2-d002-499c-8906-e9d38ed04dc0","Type":"ContainerStarted","Data":"3311b8b0a8fd1fc980702f703dcafe483930cd591c10ee8a88ce0bedff36ea35"} Apr 16 16:35:59.906248 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:59.905857 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" Apr 16 16:35:59.924595 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:35:59.924540 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" podStartSLOduration=1.3765113900000001 podStartE2EDuration="1.924521748s" podCreationTimestamp="2026-04-16 16:35:58 +0000 UTC" firstStartedPulling="2026-04-16 16:35:58.602333138 +0000 UTC m=+378.546843124" lastFinishedPulling="2026-04-16 16:35:59.150343511 +0000 UTC m=+379.094853482" observedRunningTime="2026-04-16 16:35:59.92364864 +0000 UTC m=+379.868158633" watchObservedRunningTime="2026-04-16 16:35:59.924521748 +0000 UTC m=+379.869031741" Apr 16 16:36:00.607180 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:00.607147 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c90a446-1863-4730-99e4-944444c75b47" path="/var/lib/kubelet/pods/3c90a446-1863-4730-99e4-944444c75b47/volumes" Apr 16 16:36:30.914819 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:30.914786 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-55c74f6fbc-4rtw4" Apr 16 16:36:31.837678 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:31.837646 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-9vbwp"] Apr 16 16:36:31.838006 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:31.837994 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c90a446-1863-4730-99e4-944444c75b47" containerName="manager" Apr 16 16:36:31.838050 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:31.838007 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c90a446-1863-4730-99e4-944444c75b47" containerName="manager" Apr 16 16:36:31.838084 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:31.838068 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c90a446-1863-4730-99e4-944444c75b47" containerName="manager" Apr 16 16:36:31.841196 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:31.841175 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:36:31.843639 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:31.843622 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 16:36:31.843758 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:31.843675 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-qv4km\"" Apr 16 16:36:31.848783 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:31.848760 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-9vbwp"] Apr 16 16:36:31.937035 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:31.936997 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t877q\" (UniqueName: \"kubernetes.io/projected/2a3fb6e0-bced-4d21-bc53-7131e6e85616-kube-api-access-t877q\") pod \"odh-model-controller-696fc77849-9vbwp\" (UID: \"2a3fb6e0-bced-4d21-bc53-7131e6e85616\") " pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:36:31.937035 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:31.937039 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a3fb6e0-bced-4d21-bc53-7131e6e85616-cert\") pod \"odh-model-controller-696fc77849-9vbwp\" (UID: \"2a3fb6e0-bced-4d21-bc53-7131e6e85616\") " pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:36:32.037901 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:32.037861 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t877q\" (UniqueName: \"kubernetes.io/projected/2a3fb6e0-bced-4d21-bc53-7131e6e85616-kube-api-access-t877q\") pod \"odh-model-controller-696fc77849-9vbwp\" (UID: \"2a3fb6e0-bced-4d21-bc53-7131e6e85616\") " pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:36:32.037901 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:32.037902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a3fb6e0-bced-4d21-bc53-7131e6e85616-cert\") pod \"odh-model-controller-696fc77849-9vbwp\" (UID: \"2a3fb6e0-bced-4d21-bc53-7131e6e85616\") " pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:36:32.038102 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:36:32.037997 2569 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 16:36:32.038102 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:36:32.038058 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a3fb6e0-bced-4d21-bc53-7131e6e85616-cert podName:2a3fb6e0-bced-4d21-bc53-7131e6e85616 nodeName:}" failed. No retries permitted until 2026-04-16 16:36:32.538041273 +0000 UTC m=+412.482551244 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a3fb6e0-bced-4d21-bc53-7131e6e85616-cert") pod "odh-model-controller-696fc77849-9vbwp" (UID: "2a3fb6e0-bced-4d21-bc53-7131e6e85616") : secret "odh-model-controller-webhook-cert" not found Apr 16 16:36:32.046794 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:32.046766 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t877q\" (UniqueName: \"kubernetes.io/projected/2a3fb6e0-bced-4d21-bc53-7131e6e85616-kube-api-access-t877q\") pod \"odh-model-controller-696fc77849-9vbwp\" (UID: \"2a3fb6e0-bced-4d21-bc53-7131e6e85616\") " pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:36:32.542614 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:32.542573 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a3fb6e0-bced-4d21-bc53-7131e6e85616-cert\") pod \"odh-model-controller-696fc77849-9vbwp\" (UID: \"2a3fb6e0-bced-4d21-bc53-7131e6e85616\") " pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:36:32.545069 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:32.545046 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a3fb6e0-bced-4d21-bc53-7131e6e85616-cert\") pod \"odh-model-controller-696fc77849-9vbwp\" (UID: \"2a3fb6e0-bced-4d21-bc53-7131e6e85616\") " pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:36:32.752585 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:32.752548 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:36:32.890019 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:32.889997 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-9vbwp"] Apr 16 16:36:32.892625 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:36:32.892600 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a3fb6e0_bced_4d21_bc53_7131e6e85616.slice/crio-4aef64f4b2a0a3e22bea67825af61ee0cb91bb4d0ec83609ef1e9630e81bf182 WatchSource:0}: Error finding container 4aef64f4b2a0a3e22bea67825af61ee0cb91bb4d0ec83609ef1e9630e81bf182: Status 404 returned error can't find the container with id 4aef64f4b2a0a3e22bea67825af61ee0cb91bb4d0ec83609ef1e9630e81bf182 Apr 16 16:36:33.015325 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:33.015291 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-9vbwp" event={"ID":"2a3fb6e0-bced-4d21-bc53-7131e6e85616","Type":"ContainerStarted","Data":"4aef64f4b2a0a3e22bea67825af61ee0cb91bb4d0ec83609ef1e9630e81bf182"} Apr 16 16:36:36.028279 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:36.028224 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-9vbwp" event={"ID":"2a3fb6e0-bced-4d21-bc53-7131e6e85616","Type":"ContainerStarted","Data":"a424273c6c5e154f79d214bdc3ab5a78e0db07a8b9c4ec2f9b9a539a3e5957e6"} Apr 16 16:36:36.028856 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:36.028350 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:36:36.044353 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:36.044307 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-9vbwp" 
podStartSLOduration=2.45050544 podStartE2EDuration="5.044294223s" podCreationTimestamp="2026-04-16 16:36:31 +0000 UTC" firstStartedPulling="2026-04-16 16:36:32.8939783 +0000 UTC m=+412.838488275" lastFinishedPulling="2026-04-16 16:36:35.487767088 +0000 UTC m=+415.432277058" observedRunningTime="2026-04-16 16:36:36.042152858 +0000 UTC m=+415.986662852" watchObservedRunningTime="2026-04-16 16:36:36.044294223 +0000 UTC m=+415.988804216" Apr 16 16:36:47.034522 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:36:47.034491 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-9vbwp" Apr 16 16:37:15.294223 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.294149 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2"] Apr 16 16:37:15.298772 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.298752 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.300731 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.300710 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 16 16:37:15.300836 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.300712 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 16:37:15.303743 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.303720 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2"] Apr 16 16:37:15.395104 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.395066 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/138f9e26-20b4-449a-b6b7-67cfb8024e37-data\") pod \"seaweedfs-tls-serving-7fd5766db9-r9zd2\" (UID: \"138f9e26-20b4-449a-b6b7-67cfb8024e37\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.395287 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.395112 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/138f9e26-20b4-449a-b6b7-67cfb8024e37-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-r9zd2\" (UID: \"138f9e26-20b4-449a-b6b7-67cfb8024e37\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.395287 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.395133 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhzp\" (UniqueName: \"kubernetes.io/projected/138f9e26-20b4-449a-b6b7-67cfb8024e37-kube-api-access-dwhzp\") pod \"seaweedfs-tls-serving-7fd5766db9-r9zd2\" (UID: \"138f9e26-20b4-449a-b6b7-67cfb8024e37\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.495786 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.495749 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/138f9e26-20b4-449a-b6b7-67cfb8024e37-data\") pod \"seaweedfs-tls-serving-7fd5766db9-r9zd2\" (UID: \"138f9e26-20b4-449a-b6b7-67cfb8024e37\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.495786 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.495793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/138f9e26-20b4-449a-b6b7-67cfb8024e37-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-r9zd2\" (UID: \"138f9e26-20b4-449a-b6b7-67cfb8024e37\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.496055 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.495812 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhzp\" (UniqueName: \"kubernetes.io/projected/138f9e26-20b4-449a-b6b7-67cfb8024e37-kube-api-access-dwhzp\") pod \"seaweedfs-tls-serving-7fd5766db9-r9zd2\" (UID: \"138f9e26-20b4-449a-b6b7-67cfb8024e37\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.496228 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.496205 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/138f9e26-20b4-449a-b6b7-67cfb8024e37-data\") pod \"seaweedfs-tls-serving-7fd5766db9-r9zd2\" (UID: \"138f9e26-20b4-449a-b6b7-67cfb8024e37\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.498154 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.498131 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/138f9e26-20b4-449a-b6b7-67cfb8024e37-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-r9zd2\" (UID: \"138f9e26-20b4-449a-b6b7-67cfb8024e37\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.503087 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.503061 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhzp\" (UniqueName: \"kubernetes.io/projected/138f9e26-20b4-449a-b6b7-67cfb8024e37-kube-api-access-dwhzp\") pod \"seaweedfs-tls-serving-7fd5766db9-r9zd2\" (UID: \"138f9e26-20b4-449a-b6b7-67cfb8024e37\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.608586 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.608512 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" Apr 16 16:37:15.722237 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:15.722204 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2"] Apr 16 16:37:15.725952 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:37:15.725922 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod138f9e26_20b4_449a_b6b7_67cfb8024e37.slice/crio-1edb10404effdabb1b4acaac427749033052cba3578d76d5be70b99148d7c0a1 WatchSource:0}: Error finding container 1edb10404effdabb1b4acaac427749033052cba3578d76d5be70b99148d7c0a1: Status 404 returned error can't find the container with id 1edb10404effdabb1b4acaac427749033052cba3578d76d5be70b99148d7c0a1 Apr 16 16:37:16.157347 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:16.157313 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" event={"ID":"138f9e26-20b4-449a-b6b7-67cfb8024e37","Type":"ContainerStarted","Data":"a362a6adee539f3e4b75708f1aaa41ed0bf6892ec816b4b4bc629aff2aa93c72"} Apr 16 16:37:16.157347 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:16.157348 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" event={"ID":"138f9e26-20b4-449a-b6b7-67cfb8024e37","Type":"ContainerStarted","Data":"1edb10404effdabb1b4acaac427749033052cba3578d76d5be70b99148d7c0a1"} Apr 16 16:37:16.171807 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:37:16.171762 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-r9zd2" podStartSLOduration=0.875554141 podStartE2EDuration="1.171748232s" podCreationTimestamp="2026-04-16 16:37:15 +0000 UTC" firstStartedPulling="2026-04-16 16:37:15.72718194 +0000 UTC m=+455.671691914" lastFinishedPulling="2026-04-16 16:37:16.023376034 +0000 UTC m=+455.967886005" observedRunningTime="2026-04-16 16:37:16.171695929 +0000 UTC m=+456.116205922" watchObservedRunningTime="2026-04-16 16:37:16.171748232 +0000 UTC m=+456.116258226" Apr 16 16:39:40.554547 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:39:40.554520 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:39:40.555796 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:39:40.555776 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:40:23.555616 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:23.555539 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj"] Apr 16 16:40:23.557917 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:23.557895 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" Apr 16 16:40:23.559961 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:23.559942 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-fhlr7\"" Apr 16 16:40:23.565541 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:23.565517 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj"] Apr 16 16:40:23.568311 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:23.568296 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" Apr 16 16:40:23.691129 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:23.691096 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj"] Apr 16 16:40:23.694751 ip-10-0-132-191 kubenswrapper[2569]: W0416 16:40:23.694726 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb61a3dcb_acb5_4c4d_a817_b767f7cb4aeb.slice/crio-0f3935102ce16e362b21a17f21c697d3dd16c8efca9ba9065c2a29df2d51291f WatchSource:0}: Error finding container 0f3935102ce16e362b21a17f21c697d3dd16c8efca9ba9065c2a29df2d51291f: Status 404 returned error can't find the container with id 0f3935102ce16e362b21a17f21c697d3dd16c8efca9ba9065c2a29df2d51291f Apr 16 16:40:23.696551 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:23.696535 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:40:23.769888 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:23.769856 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" event={"ID":"b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb","Type":"ContainerStarted","Data":"0f3935102ce16e362b21a17f21c697d3dd16c8efca9ba9065c2a29df2d51291f"} Apr 16 16:40:24.774074 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:24.774043 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" event={"ID":"b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb","Type":"ContainerStarted","Data":"548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6"} Apr 16 16:40:24.774426 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:24.774273 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" Apr 16 16:40:24.775946 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:24.775919 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" Apr 16 16:40:24.786920 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:40:24.786880 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" podStartSLOduration=0.930471242 podStartE2EDuration="1.786866862s" podCreationTimestamp="2026-04-16 16:40:23 +0000 UTC" firstStartedPulling="2026-04-16 16:40:23.696693917 +0000 UTC m=+643.641203892" lastFinishedPulling="2026-04-16 16:40:24.553089535 +0000 UTC m=+644.497599512" observedRunningTime="2026-04-16 16:40:24.785927563 +0000 UTC m=+644.730437556" watchObservedRunningTime="2026-04-16 16:40:24.786866862 +0000 UTC m=+644.731376856" Apr 16 16:41:48.655078 ip-10-0-132-191 
kubenswrapper[2569]: I0416 16:41:48.655050 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-xqlpj_b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb/kserve-container/0.log" Apr 16 16:41:48.943221 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:48.943128 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj"] Apr 16 16:41:48.943422 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:48.943399 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" podUID="b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb" containerName="kserve-container" containerID="cri-o://548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6" gracePeriod=30 Apr 16 16:41:49.184574 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:49.184554 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" Apr 16 16:41:50.052870 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:50.052835 2569 generic.go:358] "Generic (PLEG): container finished" podID="b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb" containerID="548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6" exitCode=2 Apr 16 16:41:50.053332 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:50.052902 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" Apr 16 16:41:50.053332 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:50.052923 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" event={"ID":"b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb","Type":"ContainerDied","Data":"548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6"} Apr 16 16:41:50.053332 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:50.052961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj" event={"ID":"b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb","Type":"ContainerDied","Data":"0f3935102ce16e362b21a17f21c697d3dd16c8efca9ba9065c2a29df2d51291f"} Apr 16 16:41:50.053332 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:50.052976 2569 scope.go:117] "RemoveContainer" containerID="548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6" Apr 16 16:41:50.061156 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:50.061140 2569 scope.go:117] "RemoveContainer" containerID="548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6" Apr 16 16:41:50.061392 ip-10-0-132-191 kubenswrapper[2569]: E0416 16:41:50.061370 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6\": container with ID starting with 548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6 not found: ID does not exist" containerID="548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6" Apr 16 16:41:50.061495 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:50.061399 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6"} err="failed to get container status \"548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6\": rpc error: code = NotFound desc = 
could not find container \"548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6\": container with ID starting with 548159bc6c6f4c46b00ebe4c4f15cf04b624373445e9695810762bc13ba870a6 not found: ID does not exist" Apr 16 16:41:50.071870 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:50.071850 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj"] Apr 16 16:41:50.076688 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:50.076659 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xqlpj"] Apr 16 16:41:50.604589 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:41:50.604555 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb" path="/var/lib/kubelet/pods/b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb/volumes" Apr 16 16:44:40.576959 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:44:40.576882 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:44:40.579282 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:44:40.579263 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:49:40.598829 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:49:40.598805 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:49:40.602763 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:49:40.602740 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:54:40.621113 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:54:40.621080 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:54:40.625210 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:54:40.625193 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:59:40.642785 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:59:40.642754 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 16:59:40.646882 ip-10-0-132-191 kubenswrapper[2569]: I0416 16:59:40.646864 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:04:40.665310 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:04:40.665281 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:04:40.671559 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:04:40.671538 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:09:40.686683 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:09:40.686658 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:09:40.693309 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:09:40.693289 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:14:40.713319 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:14:40.713208 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:14:40.719140 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:14:40.719119 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:19:40.740012 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:19:40.739888 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:19:40.746616 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:19:40.746599 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:24:40.761449 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:24:40.761319 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:24:40.767771 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:24:40.767755 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:27:59.733450 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:27:59.733353 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nv9gc_bc1da351-41a0-434d-9b7e-bf1cfdc791f4/global-pull-secret-syncer/0.log" Apr 16 17:27:59.914201 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:27:59.914174 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-smsng_d458bdca-23bd-4bb6-b0ec-a3050b306786/konnectivity-agent/0.log" Apr 16 17:27:59.934300 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:27:59.934267 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-191.ec2.internal_cbb5555364131c93c767ef634af82b6a/haproxy/0.log" Apr 16 17:28:02.791467 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:02.791423 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1/alertmanager/0.log" Apr 16 17:28:02.820376 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:02.820351 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1/config-reloader/0.log" Apr 16 17:28:02.841183 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:02.841158 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1/kube-rbac-proxy-web/0.log" Apr 16 17:28:02.871680 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:02.871656 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1/kube-rbac-proxy/0.log" Apr 16 17:28:02.899544 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:02.899516 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1/kube-rbac-proxy-metric/0.log" Apr 16 17:28:02.925633 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:02.925608 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1/prom-label-proxy/0.log" Apr 16 17:28:02.950481 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:02.950457 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5e9cdb47-3a95-4f8c-8b36-6dc0ee150dd1/init-config-reloader/0.log" Apr 16 17:28:03.035572 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.035544 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6hznm_ed9c32a6-1b36-4e37-9070-e1fc11116efa/kube-state-metrics/0.log" Apr 16 17:28:03.065388 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.065310 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6hznm_ed9c32a6-1b36-4e37-9070-e1fc11116efa/kube-rbac-proxy-main/0.log" Apr 16 17:28:03.095055 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.095033 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6hznm_ed9c32a6-1b36-4e37-9070-e1fc11116efa/kube-rbac-proxy-self/0.log" Apr 16 17:28:03.138010 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.137987 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-84986867f4-jqlsh_5210c885-bbff-4603-a4c8-b48a036b3f53/metrics-server/0.log" Apr 16 17:28:03.161038 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.161004 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-hlljz_b57d10b7-23ab-4f4d-9d24-c0e8d2ec879a/monitoring-plugin/0.log" Apr 16 17:28:03.277763 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.277736 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9kznk_8ac68bd4-1081-4135-b7ed-90d2c1e552d7/node-exporter/0.log" Apr 16 17:28:03.297949 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.297916 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9kznk_8ac68bd4-1081-4135-b7ed-90d2c1e552d7/kube-rbac-proxy/0.log" Apr 16 17:28:03.317372 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.317313 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9kznk_8ac68bd4-1081-4135-b7ed-90d2c1e552d7/init-textfile/0.log" Apr 16 17:28:03.749389 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.749356 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d8f8c7b85-d62nd_26950289-2205-4f21-8af0-9d60b932ac3d/telemeter-client/0.log" Apr 16 17:28:03.772465 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.772426 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d8f8c7b85-d62nd_26950289-2205-4f21-8af0-9d60b932ac3d/reload/0.log" Apr 16 17:28:03.791563 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:03.791534 2569 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d8f8c7b85-d62nd_26950289-2205-4f21-8af0-9d60b932ac3d/kube-rbac-proxy/0.log" Apr 16 17:28:07.027855 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.027826 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bw5bz_d9c9b578-b92d-41d0-8f31-a11cc6862b71/dns/0.log" Apr 16 17:28:07.046244 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.046215 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bw5bz_d9c9b578-b92d-41d0-8f31-a11cc6862b71/kube-rbac-proxy/0.log" Apr 16 17:28:07.195215 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.195190 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zcdxw_d2c2bb39-a1e3-4a2f-b48b-4c18cd10ce8c/dns-node-resolver/0.log" Apr 16 17:28:07.201607 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.201583 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl"] Apr 16 17:28:07.201933 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.201914 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb" containerName="kserve-container" Apr 16 17:28:07.202088 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.202069 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb" containerName="kserve-container" Apr 16 17:28:07.202187 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.202182 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b61a3dcb-acb5-4c4d-a817-b767f7cb4aeb" containerName="kserve-container" Apr 16 17:28:07.205203 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.205189 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.207461 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.207442 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qzsql\"/\"kube-root-ca.crt\"" Apr 16 17:28:07.208220 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.208201 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qzsql\"/\"default-dockercfg-vd8tg\"" Apr 16 17:28:07.208326 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.208244 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qzsql\"/\"openshift-service-ca.crt\"" Apr 16 17:28:07.213149 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.213130 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl"] Apr 16 17:28:07.269038 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.269011 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-proc\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.269193 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.269045 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-podres\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.269193 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.269078 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-lib-modules\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.269193 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.269103 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-sys\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.269193 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.269132 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzrtc\" (UniqueName: \"kubernetes.io/projected/273d5c9e-a152-447c-b1be-b365d46b54bf-kube-api-access-gzrtc\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.370404 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.370330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-proc\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " 
pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.370404 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.370361 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-podres\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.370404 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.370388 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-lib-modules\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.370404 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.370407 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-sys\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.370690 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.370460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzrtc\" (UniqueName: \"kubernetes.io/projected/273d5c9e-a152-447c-b1be-b365d46b54bf-kube-api-access-gzrtc\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.370690 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.370467 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-proc\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.370690 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.370527 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-podres\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.370690 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.370527 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-lib-modules\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.370690 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.370565 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/273d5c9e-a152-447c-b1be-b365d46b54bf-sys\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.378991 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.378972 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gzrtc\" (UniqueName: \"kubernetes.io/projected/273d5c9e-a152-447c-b1be-b365d46b54bf-kube-api-access-gzrtc\") pod \"perf-node-gather-daemonset-dq9bl\" (UID: \"273d5c9e-a152-447c-b1be-b365d46b54bf\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.516104 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.516076 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.634176 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.634113 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-767dc6c99d-gsj9l_eead9e2d-3f5a-4b10-abe0-1af5f3458da5/registry/0.log" Apr 16 17:28:07.643709 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.643682 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl"] Apr 16 17:28:07.646262 ip-10-0-132-191 kubenswrapper[2569]: W0416 17:28:07.646235 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod273d5c9e_a152_447c_b1be_b365d46b54bf.slice/crio-a82864693d71a30508f07adda8c2b800c8749f4baf516b0ecc9a5b5b3a011310 WatchSource:0}: Error finding container a82864693d71a30508f07adda8c2b800c8749f4baf516b0ecc9a5b5b3a011310: Status 404 returned error can't find the container with id a82864693d71a30508f07adda8c2b800c8749f4baf516b0ecc9a5b5b3a011310 Apr 16 17:28:07.647870 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.647850 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:28:07.711378 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.711318 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-r42bt_4240101a-1b9a-426e-bf0c-bc8b7b372154/node-ca/0.log" Apr 16 17:28:07.917554 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.917523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" event={"ID":"273d5c9e-a152-447c-b1be-b365d46b54bf","Type":"ContainerStarted","Data":"7eaaf13936c3dcd8c7ffc745de0bbd2ba04133b2a5e1b34d9a3c932d8e6cd9f7"} Apr 16 17:28:07.917714 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.917555 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" event={"ID":"273d5c9e-a152-447c-b1be-b365d46b54bf","Type":"ContainerStarted","Data":"a82864693d71a30508f07adda8c2b800c8749f4baf516b0ecc9a5b5b3a011310"} Apr 16 17:28:07.917714 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.917651 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:07.932796 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:07.932759 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" podStartSLOduration=0.932743994 podStartE2EDuration="932.743994ms" podCreationTimestamp="2026-04-16 17:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:28:07.931103213 +0000 UTC m=+3507.875613207" watchObservedRunningTime="2026-04-16 17:28:07.932743994 +0000 UTC m=+3507.877253988" Apr 16 17:28:08.702342 ip-10-0-132-191 kubenswrapper[2569]: I0416 
17:28:08.702304 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b447l_cb9c7b6f-6df0-4fa9-bdc9-90df14ca5958/serve-healthcheck-canary/0.log" Apr 16 17:28:09.115083 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:09.115000 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jlfdr_e53c6569-aafb-4b9a-8cd8-4e2f9772d993/kube-rbac-proxy/0.log" Apr 16 17:28:09.133542 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:09.133517 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jlfdr_e53c6569-aafb-4b9a-8cd8-4e2f9772d993/exporter/0.log" Apr 16 17:28:09.152771 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:09.152717 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jlfdr_e53c6569-aafb-4b9a-8cd8-4e2f9772d993/extractor/0.log" Apr 16 17:28:11.316372 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:11.316333 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-55c74f6fbc-4rtw4_be948ed2-d002-499c-8906-e9d38ed04dc0/manager/0.log" Apr 16 17:28:11.338814 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:11.338791 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-qtbx6_e8189e32-a5ea-4e94-8c62-9860c889a1c3/manager/0.log" Apr 16 17:28:11.730732 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:11.730700 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-9vbwp_2a3fb6e0-bced-4d21-bc53-7131e6e85616/manager/0.log" Apr 16 17:28:11.844605 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:11.844574 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-r4lx9_a95be73d-0f4e-4f19-96a7-489ec171e9c6/seaweedfs/0.log" Apr 16 17:28:11.889512 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:11.889490 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-r9zd2_138f9e26-20b4-449a-b6b7-67cfb8024e37/seaweedfs-tls-serving/0.log" Apr 16 17:28:13.930603 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:13.930572 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-dq9bl" Apr 16 17:28:17.022561 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:17.022535 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f86f6_1666c169-2943-4cf3-8a4a-51fc0345056b/kube-multus-additional-cni-plugins/0.log" Apr 16 17:28:17.049196 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:17.049170 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f86f6_1666c169-2943-4cf3-8a4a-51fc0345056b/egress-router-binary-copy/0.log" Apr 16 17:28:17.070586 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:17.070557 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f86f6_1666c169-2943-4cf3-8a4a-51fc0345056b/cni-plugins/0.log" Apr 16 17:28:17.092505 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:17.092483 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f86f6_1666c169-2943-4cf3-8a4a-51fc0345056b/bond-cni-plugin/0.log" Apr 16 17:28:17.112890 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:17.112873 
2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f86f6_1666c169-2943-4cf3-8a4a-51fc0345056b/routeoverride-cni/0.log" Apr 16 17:28:17.134526 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:17.134505 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f86f6_1666c169-2943-4cf3-8a4a-51fc0345056b/whereabouts-cni-bincopy/0.log" Apr 16 17:28:17.156365 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:17.156350 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f86f6_1666c169-2943-4cf3-8a4a-51fc0345056b/whereabouts-cni/0.log" Apr 16 17:28:17.380478 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:17.380385 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q5298_ca349f5f-0121-41f2-99a5-676a6e9d7f2c/kube-multus/0.log" Apr 16 17:28:17.397790 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:17.397768 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ff4ns_3fbad60e-9cf1-43dd-abb0-8d7c1caab371/network-metrics-daemon/0.log" Apr 16 17:28:17.415264 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:17.415245 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ff4ns_3fbad60e-9cf1-43dd-abb0-8d7c1caab371/kube-rbac-proxy/0.log" Apr 16 17:28:18.166347 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:18.166319 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-controller/0.log" Apr 16 17:28:18.181734 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:18.181708 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/0.log" Apr 16 17:28:18.215097 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:18.215075 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovn-acl-logging/1.log" Apr 16 17:28:18.231934 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:18.231914 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/kube-rbac-proxy-node/0.log" Apr 16 17:28:18.252573 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:18.252553 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 17:28:18.268878 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:18.268860 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/northd/0.log" Apr 16 17:28:18.286954 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:18.286934 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/nbdb/0.log" Apr 16 17:28:18.307032 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:18.307012 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/sbdb/0.log" Apr 16 17:28:18.504238 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:18.504162 2569 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b42xz_18ea2b0b-1348-4827-969f-18c4a33a0dc8/ovnkube-controller/0.log" Apr 16 17:28:20.055498 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:20.055464 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9jq9v_f81e14b6-a4d4-417f-9556-bdceafdafe3a/network-check-target-container/0.log" Apr 16 17:28:20.912312 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:20.912283 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7xm7g_c700cbcd-8214-4f4c-b770-1c0db784bc7b/iptables-alerter/0.log" Apr 16 17:28:21.571334 ip-10-0-132-191 kubenswrapper[2569]: I0416 17:28:21.571236 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8nd6p_91b79665-6a07-4b37-bdfb-a4cd7ab285a2/tuned/0.log"