Apr 17 14:33:51.136461 ip-10-0-135-180 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 14:33:51.613397 ip-10-0-135-180 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:33:51.613397 ip-10-0-135-180 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 14:33:51.613397 ip-10-0-135-180 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:33:51.613397 ip-10-0-135-180 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 14:33:51.613397 ip-10-0-135-180 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
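The deprecation warnings above all point at the file passed via --config. A minimal sketch of the equivalent KubeletConfiguration fields follows; the field names are from the kubelet.config.k8s.io/v1beta1 API, but the values here are placeholders, since this log does not show the node's actual settings:

```yaml
# Hypothetical fragment of /etc/kubernetes/kubelet.conf (the node's --config file).
# Values are illustrative, not taken from this node.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:              # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:                # replaces --minimum-container-ttl-duration per the warning's advice
  memory.available: 100Mi
```

With these fields in place the corresponding command-line flags can be dropped, which silences the warnings on the next kubelet restart.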
Apr 17 14:33:51.614581 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.614293 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 14:33:51.618289 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618271 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:33:51.618289 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618287 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618292 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618295 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618299 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618302 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618305 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618308 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618311 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618314 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618317 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618320 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618323 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618331 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618334 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618337 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618340 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618342 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618345 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618348 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618351 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:33:51.618357 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618354 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618356 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618359 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618362 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618365 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618368 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618371 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618375 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618379 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618382 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618385 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618387 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618390 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618392 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618395 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618398 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618400 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618404 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618407 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:33:51.618858 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618409 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618412 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618415 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618417 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618420 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618422 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618424 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618427 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618429 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618432 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618434 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618437 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618440 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618443 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618445 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618448 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618451 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618454 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618456 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618459 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:33:51.619328 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618462 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618464 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618466 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618469 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618472 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618474 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618480 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618483 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618486 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618488 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618491 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618493 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618496 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618499 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618504 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618510 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618514 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618519 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618522 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618525 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:33:51.619845 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618527 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:33:51.620333 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618530 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:33:51.620333 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618533 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:33:51.620333 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618535 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:33:51.620333 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618538 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:33:51.620333 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.618541 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:33:51.621056 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621042 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:33:51.621056 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621055 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621058 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621062 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621065 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621069 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621073 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621076 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621079 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621081 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621084 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621087 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621090 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621093 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621095 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621098 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621101 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621103 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621106 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621108 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621111 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:33:51.621116 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621113 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621116 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621119 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621122 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621124 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621127 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621130 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621133 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621136 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621138 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621141 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621144 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621147 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621160 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621163 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621165 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621168 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621171 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621173 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621176 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:33:51.621724 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621178 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621183 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621187 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621191 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621194 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621197 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621200 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621203 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621205 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621208 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621210 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621213 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621215 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621218 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621221 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621224 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621227 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621230 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621232 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:33:51.622232 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621235 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621238 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621240 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621243 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621245 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621248 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621251 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621253 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621256 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621259 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621261 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621264 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621266 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621269 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621271 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621274 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621277 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621279 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621283 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621286 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:33:51.622704 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621288 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621291 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621293 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621296 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621298 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.621301 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621378 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621398 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621409 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621416 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621423 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621427 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621431 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621443 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621447 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621450 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621454 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621458 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621461 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621464 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621467 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621470 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621473 2577 flags.go:64] FLAG: --cloud-config=""
Apr 17 14:33:51.623260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621476 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621479 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621484 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621486 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621490 2577 flags.go:64] FLAG: --config-dir=""
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621493 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621496 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621500 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621505 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621508 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621511 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621514 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621517 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621521 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621524 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621527 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621535 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621539 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621542 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621545 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621548 2577 flags.go:64] FLAG: --enable-server="true"
17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621552 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621557 2577 flags.go:64] FLAG: --event-burst="100" Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621560 2577 flags.go:64] FLAG: --event-qps="50" Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621563 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 14:33:51.623839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621566 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621569 2577 flags.go:64] FLAG: --eviction-hard="" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621573 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621576 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621579 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621582 2577 flags.go:64] FLAG: --eviction-soft="" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621585 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621588 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621591 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621594 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 
14:33:51.621597 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621600 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621603 2577 flags.go:64] FLAG: --feature-gates="" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621607 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621610 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621613 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621623 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621626 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621629 2577 flags.go:64] FLAG: --help="false" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621632 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-135-180.ec2.internal" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621635 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621638 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621641 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 14:33:51.624450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621646 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 14:33:51.624450 
ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621649 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621652 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621655 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621658 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621661 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621664 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621667 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621670 2577 flags.go:64] FLAG: --kube-reserved="" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621673 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621676 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621679 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621682 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621685 2577 flags.go:64] FLAG: --lock-file="" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621688 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621691 2577 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621695 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621700 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621703 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621706 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621709 2577 flags.go:64] FLAG: --logging-format="text" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621712 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621715 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621718 2577 flags.go:64] FLAG: --manifest-url="" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621721 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621726 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 14:33:51.625041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621735 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621739 2577 flags.go:64] FLAG: --max-pods="110" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621742 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621745 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 14:33:51.625668 ip-10-0-135-180 
kubenswrapper[2577]: I0417 14:33:51.621748 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621751 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621755 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621758 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621761 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621769 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621772 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621776 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621779 2577 flags.go:64] FLAG: --pod-cidr="" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621782 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621788 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621791 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621794 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621797 2577 flags.go:64] FLAG: --port="10250" Apr 17 
14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621800 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621803 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d2da050ec3097db6" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621806 2577 flags.go:64] FLAG: --qos-reserved="" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621813 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621816 2577 flags.go:64] FLAG: --register-node="true" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621819 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 17 14:33:51.625668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621822 2577 flags.go:64] FLAG: --register-with-taints="" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621826 2577 flags.go:64] FLAG: --registry-burst="10" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621828 2577 flags.go:64] FLAG: --registry-qps="5" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621831 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621834 2577 flags.go:64] FLAG: --reserved-memory="" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621838 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621841 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621844 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621846 2577 flags.go:64] FLAG: 
--rotate-server-certificates="false" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621855 2577 flags.go:64] FLAG: --runonce="false" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621858 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621861 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621864 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621867 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621870 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621873 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621877 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621880 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621883 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621886 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621889 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621891 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621895 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 
14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621898 2577 flags.go:64] FLAG: --system-cgroups="" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621901 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 14:33:51.626243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621906 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621909 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621912 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621916 2577 flags.go:64] FLAG: --tls-min-version="" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621920 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621923 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621926 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621929 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621932 2577 flags.go:64] FLAG: --v="2" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621937 2577 flags.go:64] FLAG: --version="false" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621941 2577 flags.go:64] FLAG: --vmodule="" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621945 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.621948 2577 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622062 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622065 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622068 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622071 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622081 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622084 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622087 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622089 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622092 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622095 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 14:33:51.626851 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622098 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622100 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622103 2577 feature_gate.go:328] 
unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622105 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622108 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622111 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622113 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622116 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622118 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622121 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622124 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622126 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622128 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622132 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622135 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: 
W0417 14:33:51.622138 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622140 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622143 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622145 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622148 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 14:33:51.627410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622163 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622166 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622169 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622171 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622174 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622177 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622179 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622188 2577 feature_gate.go:328] unrecognized feature gate: 
NewOLMCatalogdAPIV1Metas Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622190 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622193 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622196 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622202 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622204 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622207 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622209 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622212 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622215 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622217 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622220 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622223 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:33:51.627953 ip-10-0-135-180 kubenswrapper[2577]: 
W0417 14:33:51.622226 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622229 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622231 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622235 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622240 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622245 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622248 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622251 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622254 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622256 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622259 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622262 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622264 2577 feature_gate.go:328] unrecognized feature gate: Example2 
Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622267 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622269 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622272 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622274 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622277 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622280 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:33:51.628462 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622282 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622291 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622293 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622296 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622301 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622305 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622308 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622310 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622313 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622315 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622318 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622321 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622323 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622326 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622330 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622332 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:33:51.628929 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.622335 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:33:51.629332 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.623021 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:33:51.630498 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.630380 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 14:33:51.630540 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.630500 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 14:33:51.630923 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630914 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:33:51.630923 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630923 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630927 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630930 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630933 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630936 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630939 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630941 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630944 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630947 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630950 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630953 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630957 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630961 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630964 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630967 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630970 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630973 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630976 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630979 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630981 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:33:51.630979 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630985 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630987 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630990 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630993 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630996 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.630999 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631001 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631004 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631006 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631009 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631013 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631016 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631018 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631021 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631023 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631026 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631029 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631031 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631034 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631036 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:33:51.631486 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631039 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631041 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631044 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631046 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631049 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631051 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631054 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631057 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631060 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631062 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631064 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631067 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631069 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631072 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631076 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631078 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631081 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631084 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631086 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631089 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:33:51.631990 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631091 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631094 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631096 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631100 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631102 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631105 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631107 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631110 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631113 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631115 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631118 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631120 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631123 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631125 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631128 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631130 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631133 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631135 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631138 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:33:51.632532 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631141 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631145 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631166 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631171 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631174 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631177 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.631182 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631280 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631286 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631289 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631292 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631295 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631297 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631301 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631305 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:33:51.633003 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631308 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631311 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631314 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631317 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631319 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631322 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631324 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631327 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631329 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631332 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631334 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631336 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631339 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631341 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631345 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631349 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631351 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631354 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631357 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631360 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:33:51.633389 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631363 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631366 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631368 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631371 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631375 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631378 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631380 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631383 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631386 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631388 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631391 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631393 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631396 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631398 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631400 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631403 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631406 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631408 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631410 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:33:51.633926 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631413 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631415 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631418 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631420 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631423 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631425 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631428 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631430 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631433 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631435 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631437 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631440 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631442 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631445 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631447 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631450 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631453 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631456 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631458 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631461 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:33:51.634396 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631464 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631466 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631469 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631471 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631474 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631476 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631479 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631481 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631484 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631487 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631489 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631492 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631494 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631497 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631500 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631503 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631506 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631508 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:33:51.634873 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:51.631511 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:33:51.635376 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.631516 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:33:51.635376 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.631628 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 14:33:51.635376 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.633678 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 14:33:51.635376 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.634614 2577 server.go:1019] "Starting client certificate rotation"
Apr 17 14:33:51.635376 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.634711 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:33:51.635376 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.634748 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:33:51.661798 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.661776 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:33:51.665220 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.665189 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:33:51.680900 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.680876 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 17 14:33:51.686826 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.686809 2577 log.go:25] "Validated CRI v1 image API"
Apr 17 14:33:51.691030 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.691007 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 14:33:51.694395 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.694370 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8050a6f1-4257-485d-9b8e-215799a9743b:/dev/nvme0n1p3 826d49f1-c2b9-4fe4-873f-3d705c6c603a:/dev/nvme0n1p4]
Apr 17 14:33:51.694492 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.694394 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 14:33:51.696710 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.696684 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:33:51.699884 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.699772 2577 manager.go:217] Machine: {Timestamp:2026-04-17 14:33:51.698479861 +0000 UTC m=+0.435707853 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097613 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21497a026434a58d868fd31b26210e SystemUUID:ec21497a-0264-34a5-8d86-8fd31b26210e BootID:f73da485-407c-4869-a427-43863e5db85e Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:53:f6:42:6e:a1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:53:f6:42:6e:a1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4e:79:0a:4a:71:42 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 14:33:51.699884 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.699878 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 14:33:51.700024 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.700012 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 14:33:51.701149 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.701123 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 14:33:51.701320 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.701164 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-180.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 14:33:51.701367 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.701332 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 14:33:51.701367 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.701341 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 14:33:51.701367 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.701355 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:33:51.701451 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.701368 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:33:51.702807 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.702796 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:33:51.702925 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.702916 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 14:33:51.705832 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.705822 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 17 14:33:51.705870 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.705836 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 14:33:51.705870 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.705848 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 14:33:51.705870 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.705858 2577 kubelet.go:397] "Adding apiserver pod source" Apr 17 14:33:51.705870 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.705866 2577 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 14:33:51.706983 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.706971 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:33:51.707050 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.706989 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:33:51.711063 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.711026 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 14:33:51.712616 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.712600 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 14:33:51.714488 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714474 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 14:33:51.714564 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714497 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 14:33:51.714564 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714506 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 14:33:51.714564 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714515 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 14:33:51.714564 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714524 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 14:33:51.714564 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714534 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 14:33:51.714564 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714542 2577 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 17 14:33:51.714564 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714551 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 14:33:51.714564 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714562 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 14:33:51.714768 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714571 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 14:33:51.714768 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714581 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 14:33:51.714768 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.714590 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 14:33:51.716289 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.716276 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 14:33:51.716322 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.716292 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 14:33:51.719393 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.719366 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 14:33:51.719393 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.719372 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-180.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 14:33:51.719879 ip-10-0-135-180 kubenswrapper[2577]: 
I0417 14:33:51.719868 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 14:33:51.719926 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.719905 2577 server.go:1295] "Started kubelet" Apr 17 14:33:51.719973 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.719949 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 14:33:51.720103 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.720054 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 14:33:51.720204 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.720121 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 14:33:51.720774 ip-10-0-135-180 systemd[1]: Started Kubernetes Kubelet. Apr 17 14:33:51.721291 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.721221 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 14:33:51.722681 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.722664 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 17 14:33:51.727029 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.727007 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 14:33:51.727685 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.727539 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 14:33:51.728291 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.728262 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 14:33:51.728403 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.728388 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 14:33:51.728530 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.728512 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 14:33:51.728712 ip-10-0-135-180 kubenswrapper[2577]: I0417 
14:33:51.728695 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 17 14:33:51.728712 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.728710 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 17 14:33:51.729048 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.729032 2577 factory.go:55] Registering systemd factory Apr 17 14:33:51.729109 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.729045 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 14:33:51.729109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.729058 2577 factory.go:223] Registration of the systemd container factory successfully Apr 17 14:33:51.729350 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.729337 2577 factory.go:153] Registering CRI-O factory Apr 17 14:33:51.729396 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.729352 2577 factory.go:223] Registration of the crio container factory successfully Apr 17 14:33:51.729430 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.729417 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 14:33:51.729493 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.729456 2577 factory.go:103] Registering Raw factory Apr 17 14:33:51.729493 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.729474 2577 manager.go:1196] Started watching for new ooms in manager Apr 17 14:33:51.729493 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.729468 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:51.729915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.729897 2577 csi_plugin.go:988] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-180.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 14:33:51.730008 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.729977 2577 manager.go:319] Starting recovery of all containers Apr 17 14:33:51.731083 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.729990 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-180.ec2.internal.18a72b85ceb9d08f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-180.ec2.internal,UID:ip-10-0-135-180.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-180.ec2.internal,},FirstTimestamp:2026-04-17 14:33:51.719878799 +0000 UTC m=+0.457106791,LastTimestamp:2026-04-17 14:33:51.719878799 +0000 UTC m=+0.457106791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-180.ec2.internal,}" Apr 17 14:33:51.733829 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.733803 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 14:33:51.733910 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.733841 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-180.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 14:33:51.735766 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.735736 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6mc2z" Apr 17 14:33:51.740710 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.740688 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6mc2z" Apr 17 14:33:51.742413 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.742394 2577 manager.go:324] Recovery completed Apr 17 14:33:51.746745 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.746733 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:51.749513 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.749494 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:51.749585 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.749531 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:51.749585 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.749542 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:51.750057 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.750039 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 14:33:51.750057 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.750056 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 14:33:51.750167 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.750076 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:33:51.751371 
ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.751306 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-180.ec2.internal.18a72b85d07dfbcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-180.ec2.internal,UID:ip-10-0-135-180.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-180.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-180.ec2.internal,},FirstTimestamp:2026-04-17 14:33:51.749512139 +0000 UTC m=+0.486740138,LastTimestamp:2026-04-17 14:33:51.749512139 +0000 UTC m=+0.486740138,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-180.ec2.internal,}" Apr 17 14:33:51.753165 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.753142 2577 policy_none.go:49] "None policy: Start" Apr 17 14:33:51.753220 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.753170 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 14:33:51.753220 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.753181 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 17 14:33:51.784924 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.784909 2577 manager.go:341] "Starting Device Plugin manager" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.784968 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.784983 2577 server.go:85] "Starting device plugin registration server" Apr 17 14:33:51.796919 ip-10-0-135-180 
kubenswrapper[2577]: I0417 14:33:51.785258 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.785270 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.785346 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.785432 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.785447 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.786338 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.786379 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.793557 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.794892 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.794917 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.794940 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.794948 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 14:33:51.796919 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.795044 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 14:33:51.797412 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.796997 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:51.886223 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.886115 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:51.887186 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.887166 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:51.887284 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.887197 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:51.887284 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.887208 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:51.887284 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.887240 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-180.ec2.internal" Apr 17 14:33:51.895506 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.895487 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-180.ec2.internal" Apr 17 14:33:51.895615 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.895515 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-180.ec2.internal\": node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 
14:33:51.895615 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.895606 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal"] Apr 17 14:33:51.895716 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.895669 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:51.896542 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.896526 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:51.896631 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.896558 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:51.896631 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.896571 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:51.897825 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.897813 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:51.898001 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.897985 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" Apr 17 14:33:51.898064 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.898021 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:51.898605 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.898574 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:51.898605 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.898582 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:51.898750 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.898633 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:51.898750 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.898653 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:51.898750 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.898606 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:51.898750 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.898685 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:51.900105 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.900086 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal" Apr 17 14:33:51.900210 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.900112 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:51.900841 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.900825 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:51.900907 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.900855 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:51.900907 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:51.900871 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:51.911072 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.911054 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:51.926719 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.926695 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-180.ec2.internal\" not found" node="ip-10-0-135-180.ec2.internal" Apr 17 14:33:51.931068 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:51.931052 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-180.ec2.internal\" not found" node="ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.011674 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:52.011636 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:52.029889 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.029865 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b44101f799cef1b7257884fc1ea11bb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal\" (UID: \"2b44101f799cef1b7257884fc1ea11bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.029994 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.029897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6caffc9b9af5358bace19aa9382d0587-config\") pod \"kube-apiserver-proxy-ip-10-0-135-180.ec2.internal\" (UID: \"6caffc9b9af5358bace19aa9382d0587\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.030277 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.030246 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2b44101f799cef1b7257884fc1ea11bb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal\" (UID: \"2b44101f799cef1b7257884fc1ea11bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.111845 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:52.111795 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:52.131421 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.131394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b44101f799cef1b7257884fc1ea11bb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal\" (UID: \"2b44101f799cef1b7257884fc1ea11bb\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.131489 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.131427 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6caffc9b9af5358bace19aa9382d0587-config\") pod \"kube-apiserver-proxy-ip-10-0-135-180.ec2.internal\" (UID: \"6caffc9b9af5358bace19aa9382d0587\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.131489 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.131443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2b44101f799cef1b7257884fc1ea11bb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal\" (UID: \"2b44101f799cef1b7257884fc1ea11bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.131555 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.131510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6caffc9b9af5358bace19aa9382d0587-config\") pod \"kube-apiserver-proxy-ip-10-0-135-180.ec2.internal\" (UID: \"6caffc9b9af5358bace19aa9382d0587\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.131555 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.131515 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b44101f799cef1b7257884fc1ea11bb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal\" (UID: \"2b44101f799cef1b7257884fc1ea11bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.131555 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.131510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2b44101f799cef1b7257884fc1ea11bb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal\" (UID: \"2b44101f799cef1b7257884fc1ea11bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.212893 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:52.212812 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:52.230092 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.230064 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.233419 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.233210 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal" Apr 17 14:33:52.313041 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:52.312983 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:52.413522 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:52.413483 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:52.514068 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:52.513974 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:52.614526 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:52.614497 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:52.634932 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.634904 2577 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 17 14:33:52.635352 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.635087 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:33:52.715583 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:52.715540 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:52.727749 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.727718 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 14:33:52.743165 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.743123 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 14:28:51 +0000 UTC" deadline="2027-09-11 11:03:24.444155321 +0000 UTC" Apr 17 14:33:52.743165 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.743149 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12284h29m31.701008736s" Apr 17 14:33:52.748545 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.748520 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 14:33:52.768585 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.768511 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6lk26" Apr 17 14:33:52.776189 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.776144 2577 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-6lk26" Apr 17 14:33:52.815808 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:52.815771 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:52.887023 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:52.886773 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6caffc9b9af5358bace19aa9382d0587.slice/crio-4ae5c4034e0015dd73bb97280e7d3568ea65877c668fb452fb7578b6b37d8e3d WatchSource:0}: Error finding container 4ae5c4034e0015dd73bb97280e7d3568ea65877c668fb452fb7578b6b37d8e3d: Status 404 returned error can't find the container with id 4ae5c4034e0015dd73bb97280e7d3568ea65877c668fb452fb7578b6b37d8e3d Apr 17 14:33:52.887366 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:52.887342 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b44101f799cef1b7257884fc1ea11bb.slice/crio-567a7f54e38921e0becb96de8f71260e3d3657a63736dd582cdaafffe06f8b7f WatchSource:0}: Error finding container 567a7f54e38921e0becb96de8f71260e3d3657a63736dd582cdaafffe06f8b7f: Status 404 returned error can't find the container with id 567a7f54e38921e0becb96de8f71260e3d3657a63736dd582cdaafffe06f8b7f Apr 17 14:33:52.891350 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.891334 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:33:52.916258 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:52.916210 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:52.987129 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:52.987100 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:53.017076 
ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.017042 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:53.018087 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.018073 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:53.117281 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.117180 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:53.218017 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.217970 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-180.ec2.internal\" not found" Apr 17 14:33:53.240756 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.240731 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:53.328631 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.328592 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal" Apr 17 14:33:53.336962 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.336933 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:33:53.337947 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.337930 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" Apr 17 14:33:53.350209 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.350188 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:33:53.707141 
ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.707107 2577 apiserver.go:52] "Watching apiserver" Apr 17 14:33:53.714532 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.714502 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 14:33:53.714958 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.714933 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal","openshift-image-registry/node-ca-tf94b","openshift-multus/multus-additional-cni-plugins-49c8v","openshift-network-diagnostics/network-check-target-v8n74","openshift-network-operator/iptables-alerter-vfggb","openshift-ovn-kubernetes/ovnkube-node-pbxb9","kube-system/konnectivity-agent-7vwcw","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4","openshift-cluster-node-tuning-operator/tuned-jtqxv","openshift-dns/node-resolver-v95rv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal","openshift-multus/multus-sr7t4","openshift-multus/network-metrics-daemon-4nvfl"] Apr 17 14:33:53.717991 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.717964 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7vwcw" Apr 17 14:33:53.720488 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.720323 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-tgpzr\"" Apr 17 14:33:53.720488 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.720407 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 14:33:53.720488 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.720329 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 14:33:53.722509 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.722350 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:33:53.722509 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.722437 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:33:53.722768 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.722752 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:53.725326 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.725304 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 14:33:53.725914 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.725591 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 14:33:53.725914 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.725606 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-njt8v\"" Apr 17 14:33:53.725914 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.725661 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v95rv" Apr 17 14:33:53.725914 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.725668 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 14:33:53.727963 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.727939 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2cbqb\"" Apr 17 14:33:53.727963 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.727950 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 14:33:53.728118 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.727941 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 14:33:53.730113 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.730096 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.730531 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.730174 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vfggb" Apr 17 14:33:53.732356 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.732336 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 14:33:53.732462 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.732412 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 14:33:53.732688 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.732667 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.732768 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.732735 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jrs7q\"" Apr 17 14:33:53.732817 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.732775 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 14:33:53.733309 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.733006 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:33:53.733309 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.733065 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jxqxc\"" Apr 17 14:33:53.733522 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.733338 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 14:33:53.733522 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.733457 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 14:33:53.733522 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.733480 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 14:33:53.733681 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.733460 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 14:33:53.734923 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.734904 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 14:33:53.735051 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.734996 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 14:33:53.735886 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.735563 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-g6m69\"" Apr 17 14:33:53.735886 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.735706 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 14:33:53.735886 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.735720 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 14:33:53.735886 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.735759 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 14:33:53.735886 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.735775 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 14:33:53.737344 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.737325 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.737432 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.737403 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.739676 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.739370 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 14:33:53.739676 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.739455 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 14:33:53.739871 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.739855 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.739967 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.739946 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-hthmd\"" Apr 17 14:33:53.740091 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740032 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 14:33:53.740337 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740231 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:33:53.740337 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740292 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vbldb\"" Apr 17 14:33:53.740469 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740401 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b72db081-842b-4ea0-a24a-f1f7916d7ee5-host\") pod \"node-ca-tf94b\" (UID: \"b72db081-842b-4ea0-a24a-f1f7916d7ee5\") " pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:53.740469 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j64g\" (UniqueName: \"kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g\") pod \"network-check-target-v8n74\" (UID: \"5b856db3-62df-499f-acb9-ca566ed0be77\") " pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:33:53.740678 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740471 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-os-release\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.740678 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740498 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d17101b-12ed-4f95-89fd-3e67caa7db02-host-slash\") pod \"iptables-alerter-vfggb\" (UID: \"9d17101b-12ed-4f95-89fd-3e67caa7db02\") " pod="openshift-network-operator/iptables-alerter-vfggb" Apr 17 14:33:53.740678 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tf2k\" (UniqueName: \"kubernetes.io/projected/b72db081-842b-4ea0-a24a-f1f7916d7ee5-kube-api-access-7tf2k\") pod \"node-ca-tf94b\" (UID: \"b72db081-842b-4ea0-a24a-f1f7916d7ee5\") " pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:53.740678 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d058fa6d-b2cb-4092-b4f9-49dee965fff3-hosts-file\") pod \"node-resolver-v95rv\" (UID: \"d058fa6d-b2cb-4092-b4f9-49dee965fff3\") " pod="openshift-dns/node-resolver-v95rv" Apr 17 14:33:53.740678 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740570 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzrmg\" (UniqueName: \"kubernetes.io/projected/d058fa6d-b2cb-4092-b4f9-49dee965fff3-kube-api-access-jzrmg\") pod \"node-resolver-v95rv\" (UID: \"d058fa6d-b2cb-4092-b4f9-49dee965fff3\") " pod="openshift-dns/node-resolver-v95rv" Apr 17 14:33:53.740678 ip-10-0-135-180 
kubenswrapper[2577]: I0417 14:33:53.740594 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-cnibin\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.740678 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740617 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47a7cca6-6a79-404f-9826-84de279713bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.740678 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740643 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/47a7cca6-6a79-404f-9826-84de279713bb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-system-cni-dir\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740734 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740758 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a095d6e7-582d-469a-8e0f-5f14ef60b4bf-konnectivity-ca\") pod \"konnectivity-agent-7vwcw\" (UID: \"a095d6e7-582d-469a-8e0f-5f14ef60b4bf\") " pod="kube-system/konnectivity-agent-7vwcw" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b72db081-842b-4ea0-a24a-f1f7916d7ee5-serviceca\") pod \"node-ca-tf94b\" (UID: \"b72db081-842b-4ea0-a24a-f1f7916d7ee5\") " pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d058fa6d-b2cb-4092-b4f9-49dee965fff3-tmp-dir\") pod \"node-resolver-v95rv\" (UID: \"d058fa6d-b2cb-4092-b4f9-49dee965fff3\") " pod="openshift-dns/node-resolver-v95rv" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47a7cca6-6a79-404f-9826-84de279713bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740862 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn44h\" (UniqueName: \"kubernetes.io/projected/47a7cca6-6a79-404f-9826-84de279713bb-kube-api-access-cn44h\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740888 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9d17101b-12ed-4f95-89fd-3e67caa7db02-iptables-alerter-script\") pod \"iptables-alerter-vfggb\" (UID: \"9d17101b-12ed-4f95-89fd-3e67caa7db02\") " pod="openshift-network-operator/iptables-alerter-vfggb" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nspf\" (UniqueName: \"kubernetes.io/projected/9d17101b-12ed-4f95-89fd-3e67caa7db02-kube-api-access-7nspf\") pod \"iptables-alerter-vfggb\" (UID: \"9d17101b-12ed-4f95-89fd-3e67caa7db02\") " pod="openshift-network-operator/iptables-alerter-vfggb" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.740937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a095d6e7-582d-469a-8e0f-5f14ef60b4bf-agent-certs\") pod \"konnectivity-agent-7vwcw\" (UID: \"a095d6e7-582d-469a-8e0f-5f14ef60b4bf\") " pod="kube-system/konnectivity-agent-7vwcw" Apr 17 14:33:53.741058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.741003 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 14:33:53.742384 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.742182 2577 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8f7nj\"" Apr 17 14:33:53.742475 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.742445 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:53.742531 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.742506 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:33:53.743084 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.742766 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 14:33:53.777616 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.777582 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:28:52 +0000 UTC" deadline="2027-11-11 07:18:59.400463567 +0000 UTC" Apr 17 14:33:53.777616 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.777616 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13744h45m5.622851433s" Apr 17 14:33:53.801967 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.801912 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" event={"ID":"2b44101f799cef1b7257884fc1ea11bb","Type":"ContainerStarted","Data":"567a7f54e38921e0becb96de8f71260e3d3657a63736dd582cdaafffe06f8b7f"} Apr 17 14:33:53.803571 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.803544 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal" event={"ID":"6caffc9b9af5358bace19aa9382d0587","Type":"ContainerStarted","Data":"4ae5c4034e0015dd73bb97280e7d3568ea65877c668fb452fb7578b6b37d8e3d"} Apr 17 14:33:53.829671 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.829647 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 14:33:53.841851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.841824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn44h\" (UniqueName: \"kubernetes.io/projected/47a7cca6-6a79-404f-9826-84de279713bb-kube-api-access-cn44h\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.841982 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.841861 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-log-socket\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.841982 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.841880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.841982 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.841896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-lib-modules\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.841982 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.841913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-run-multus-certs\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.842131 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.841964 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5mmn\" (UniqueName: \"kubernetes.io/projected/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-kube-api-access-j5mmn\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:53.842131 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842028 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-kubernetes\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.842131 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-tmp\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.842131 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842064 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-os-release\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.842131 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-var-lib-cni-bin\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.842131 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-var-lib-cni-multus\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.842351 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-sys-fs\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.842351 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d17101b-12ed-4f95-89fd-3e67caa7db02-host-slash\") pod \"iptables-alerter-vfggb\" (UID: \"9d17101b-12ed-4f95-89fd-3e67caa7db02\") " pod="openshift-network-operator/iptables-alerter-vfggb" Apr 17 
14:33:53.842351 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d058fa6d-b2cb-4092-b4f9-49dee965fff3-hosts-file\") pod \"node-resolver-v95rv\" (UID: \"d058fa6d-b2cb-4092-b4f9-49dee965fff3\") " pod="openshift-dns/node-resolver-v95rv" Apr 17 14:33:53.842351 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96429a18-321a-48f1-8136-1ce5473e98ee-ovn-node-metrics-cert\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.842351 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d17101b-12ed-4f95-89fd-3e67caa7db02-host-slash\") pod \"iptables-alerter-vfggb\" (UID: \"9d17101b-12ed-4f95-89fd-3e67caa7db02\") " pod="openshift-network-operator/iptables-alerter-vfggb" Apr 17 14:33:53.842351 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-sysctl-d\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.842351 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-run-k8s-cni-cncf-io\") pod \"multus-sr7t4\" (UID: 
\"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.842351 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d058fa6d-b2cb-4092-b4f9-49dee965fff3-hosts-file\") pod \"node-resolver-v95rv\" (UID: \"d058fa6d-b2cb-4092-b4f9-49dee965fff3\") " pod="openshift-dns/node-resolver-v95rv" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-device-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842400 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b72db081-842b-4ea0-a24a-f1f7916d7ee5-serviceca\") pod \"node-ca-tf94b\" (UID: \"b72db081-842b-4ea0-a24a-f1f7916d7ee5\") " pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-etc-openvswitch\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-run-ovn\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-node-log\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842461 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-cni-dir\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-etc-selinux\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842544 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-sys\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b72db081-842b-4ea0-a24a-f1f7916d7ee5-host\") pod \"node-ca-tf94b\" (UID: \"b72db081-842b-4ea0-a24a-f1f7916d7ee5\") " pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:53.842610 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842606 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-slash\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842655 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b72db081-842b-4ea0-a24a-f1f7916d7ee5-host\") pod \"node-ca-tf94b\" (UID: \"b72db081-842b-4ea0-a24a-f1f7916d7ee5\") " pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-run-openvswitch\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842703 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-run-netns\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96429a18-321a-48f1-8136-1ce5473e98ee-ovnkube-config\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842763 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tncrd\" (UniqueName: \"kubernetes.io/projected/1625a2e5-b33d-4e5d-85f4-286dd067f854-kube-api-access-tncrd\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-registration-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b72db081-842b-4ea0-a24a-f1f7916d7ee5-serviceca\") pod \"node-ca-tf94b\" (UID: \"b72db081-842b-4ea0-a24a-f1f7916d7ee5\") " pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:53.843094 
ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-system-cni-dir\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-cni-bin\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-system-cni-dir\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842888 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96429a18-321a-48f1-8136-1ce5473e98ee-env-overrides\") pod \"ovnkube-node-pbxb9\" 
(UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842910 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-system-cni-dir\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-daemon-config\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p49cf\" (UniqueName: \"kubernetes.io/projected/622b093f-2902-4bca-b6a5-3cad2c516a5a-kube-api-access-p49cf\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.842997 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.843094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/d058fa6d-b2cb-4092-b4f9-49dee965fff3-tmp-dir\") pod \"node-resolver-v95rv\" (UID: \"d058fa6d-b2cb-4092-b4f9-49dee965fff3\") " pod="openshift-dns/node-resolver-v95rv" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-sysctl-conf\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-systemd-units\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843189 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-run\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843208 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-conf-dir\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843240 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-etc-kubernetes\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzrmg\" (UniqueName: \"kubernetes.io/projected/d058fa6d-b2cb-4092-b4f9-49dee965fff3-kube-api-access-jzrmg\") pod \"node-resolver-v95rv\" (UID: \"d058fa6d-b2cb-4092-b4f9-49dee965fff3\") " pod="openshift-dns/node-resolver-v95rv" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-cnibin\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47a7cca6-6a79-404f-9826-84de279713bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 
14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d058fa6d-b2cb-4092-b4f9-49dee965fff3-tmp-dir\") pod \"node-resolver-v95rv\" (UID: \"d058fa6d-b2cb-4092-b4f9-49dee965fff3\") " pod="openshift-dns/node-resolver-v95rv" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/47a7cca6-6a79-404f-9826-84de279713bb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-cnibin\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-kubelet\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843439 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-run-systemd\") pod \"ovnkube-node-pbxb9\" (UID: 
\"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2mwk\" (UniqueName: \"kubernetes.io/projected/96429a18-321a-48f1-8136-1ce5473e98ee-kube-api-access-x2mwk\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.843849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-var-lib-kubelet\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96429a18-321a-48f1-8136-1ce5473e98ee-ovnkube-script-lib\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843535 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1625a2e5-b33d-4e5d-85f4-286dd067f854-cni-binary-copy\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" 
(UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-socket-dir-parent\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-run-netns\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-socket-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-host\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843650 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxlbv\" (UniqueName: \"kubernetes.io/projected/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-kube-api-access-vxlbv\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843685 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9d17101b-12ed-4f95-89fd-3e67caa7db02-iptables-alerter-script\") pod \"iptables-alerter-vfggb\" (UID: \"9d17101b-12ed-4f95-89fd-3e67caa7db02\") " pod="openshift-network-operator/iptables-alerter-vfggb" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nspf\" (UniqueName: \"kubernetes.io/projected/9d17101b-12ed-4f95-89fd-3e67caa7db02-kube-api-access-7nspf\") pod \"iptables-alerter-vfggb\" (UID: \"9d17101b-12ed-4f95-89fd-3e67caa7db02\") " pod="openshift-network-operator/iptables-alerter-vfggb" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a095d6e7-582d-469a-8e0f-5f14ef60b4bf-agent-certs\") pod \"konnectivity-agent-7vwcw\" (UID: \"a095d6e7-582d-469a-8e0f-5f14ef60b4bf\") " pod="kube-system/konnectivity-agent-7vwcw" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j64g\" (UniqueName: \"kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g\") pod \"network-check-target-v8n74\" (UID: \"5b856db3-62df-499f-acb9-ca566ed0be77\") " pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843903 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-os-release\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " 
pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843928 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47a7cca6-6a79-404f-9826-84de279713bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843933 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-tuned\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-cnibin\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.844557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.843971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/47a7cca6-6a79-404f-9826-84de279713bb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844020 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/47a7cca6-6a79-404f-9826-84de279713bb-os-release\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844048 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-var-lib-kubelet\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tf2k\" (UniqueName: \"kubernetes.io/projected/b72db081-842b-4ea0-a24a-f1f7916d7ee5-kube-api-access-7tf2k\") pod \"node-ca-tf94b\" (UID: \"b72db081-842b-4ea0-a24a-f1f7916d7ee5\") " pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844102 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-var-lib-openvswitch\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844126 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-cni-netd\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844189 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-modprobe-d\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv"
Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844218 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-sysconfig\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv"
Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844241 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-systemd\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv"
Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844269 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-hostroot\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4"
Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844272 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9d17101b-12ed-4f95-89fd-3e67caa7db02-iptables-alerter-script\") pod \"iptables-alerter-vfggb\" (UID: \"9d17101b-12ed-4f95-89fd-3e67caa7db02\") " pod="openshift-network-operator/iptables-alerter-vfggb"
Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844297 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4"
Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a095d6e7-582d-469a-8e0f-5f14ef60b4bf-konnectivity-ca\") pod \"konnectivity-agent-7vwcw\" (UID: \"a095d6e7-582d-469a-8e0f-5f14ef60b4bf\") " pod="kube-system/konnectivity-agent-7vwcw"
Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47a7cca6-6a79-404f-9826-84de279713bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v"
Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.844865 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47a7cca6-6a79-404f-9826-84de279713bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v"
Apr 17 14:33:53.845078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.845028 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a095d6e7-582d-469a-8e0f-5f14ef60b4bf-konnectivity-ca\") pod \"konnectivity-agent-7vwcw\" (UID: \"a095d6e7-582d-469a-8e0f-5f14ef60b4bf\") " pod="kube-system/konnectivity-agent-7vwcw"
Apr 17 14:33:53.847467 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.847449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a095d6e7-582d-469a-8e0f-5f14ef60b4bf-agent-certs\") pod \"konnectivity-agent-7vwcw\" (UID: \"a095d6e7-582d-469a-8e0f-5f14ef60b4bf\") " pod="kube-system/konnectivity-agent-7vwcw"
Apr 17 14:33:53.852953 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.852933 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:33:53.852953 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.852955 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:33:53.853129 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.852964 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9j64g for pod openshift-network-diagnostics/network-check-target-v8n74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:33:53.853129 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.853028 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g podName:5b856db3-62df-499f-acb9-ca566ed0be77 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:54.352999052 +0000 UTC m=+3.090227040 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9j64g" (UniqueName: "kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g") pod "network-check-target-v8n74" (UID: "5b856db3-62df-499f-acb9-ca566ed0be77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:33:53.855264 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.855209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn44h\" (UniqueName: \"kubernetes.io/projected/47a7cca6-6a79-404f-9826-84de279713bb-kube-api-access-cn44h\") pod \"multus-additional-cni-plugins-49c8v\" (UID: \"47a7cca6-6a79-404f-9826-84de279713bb\") " pod="openshift-multus/multus-additional-cni-plugins-49c8v"
Apr 17 14:33:53.855479 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.855456 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzrmg\" (UniqueName: \"kubernetes.io/projected/d058fa6d-b2cb-4092-b4f9-49dee965fff3-kube-api-access-jzrmg\") pod \"node-resolver-v95rv\" (UID: \"d058fa6d-b2cb-4092-b4f9-49dee965fff3\") " pod="openshift-dns/node-resolver-v95rv"
Apr 17 14:33:53.856167 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.856130 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nspf\" (UniqueName: \"kubernetes.io/projected/9d17101b-12ed-4f95-89fd-3e67caa7db02-kube-api-access-7nspf\") pod \"iptables-alerter-vfggb\" (UID: \"9d17101b-12ed-4f95-89fd-3e67caa7db02\") "
pod="openshift-network-operator/iptables-alerter-vfggb" Apr 17 14:33:53.856741 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.856697 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tf2k\" (UniqueName: \"kubernetes.io/projected/b72db081-842b-4ea0-a24a-f1f7916d7ee5-kube-api-access-7tf2k\") pod \"node-ca-tf94b\" (UID: \"b72db081-842b-4ea0-a24a-f1f7916d7ee5\") " pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:53.945013 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.944969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p49cf\" (UniqueName: \"kubernetes.io/projected/622b093f-2902-4bca-b6a5-3cad2c516a5a-kube-api-access-p49cf\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.945207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.945207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-sysctl-conf\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.945207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945086 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-systemd-units\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.945207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-run\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.945207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-conf-dir\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.945207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-conf-dir\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-etc-kubernetes\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-run\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-kubelet\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945306 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-sysctl-conf\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-run-systemd\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-run-systemd\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-etc-kubernetes\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2mwk\" (UniqueName: \"kubernetes.io/projected/96429a18-321a-48f1-8136-1ce5473e98ee-kube-api-access-x2mwk\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945320 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-systemd-units\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945400 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-var-lib-kubelet\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.945494 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-kubelet\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96429a18-321a-48f1-8136-1ce5473e98ee-ovnkube-script-lib\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945538 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1625a2e5-b33d-4e5d-85f4-286dd067f854-cni-binary-copy\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-socket-dir-parent\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-var-lib-kubelet\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-run-netns\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-socket-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-host\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxlbv\" (UniqueName: \"kubernetes.io/projected/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-kube-api-access-vxlbv\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945665 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-run-netns\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-tuned\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945724 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-socket-dir-parent\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-cnibin\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-var-lib-kubelet\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-var-lib-openvswitch\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-var-lib-openvswitch\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945902 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-host\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945930 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-cnibin\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945946 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-cni-netd\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-var-lib-kubelet\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.945975 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-modprobe-d\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946001 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-sysconfig\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-systemd\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-hostroot\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946103 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-log-socket\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946130 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946167 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-socket-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-lib-modules\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1625a2e5-b33d-4e5d-85f4-286dd067f854-cni-binary-copy\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946222 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/96429a18-321a-48f1-8136-1ce5473e98ee-ovnkube-script-lib\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-run-multus-certs\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946249 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-run-multus-certs\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-cni-netd\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5mmn\" (UniqueName: \"kubernetes.io/projected/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-kube-api-access-j5mmn\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946290 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-systemd\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.946829 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-kubernetes\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946327 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-tmp\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946332 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-lib-modules\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946341 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-modprobe-d\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946332 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-hostroot\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946350 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-log-socket\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946382 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946392 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-kubernetes\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-sysconfig\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-os-release\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-var-lib-cni-bin\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-var-lib-cni-multus\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-sys-fs\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/96429a18-321a-48f1-8136-1ce5473e98ee-ovn-node-metrics-cert\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-sysctl-d\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-var-lib-cni-bin\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-var-lib-cni-multus\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.947641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-run-k8s-cni-cncf-io\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-host-run-k8s-cni-cncf-io\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-device-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946676 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-etc-openvswitch\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-sysctl-d\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-run-ovn\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-node-log\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946754 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-device-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-cni-dir\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-run-ovn\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-etc-selinux\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.946805 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-sys\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-slash\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-sys-fs\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:53.946881 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs podName:a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87 nodeName:}" failed. 
No retries permitted until 2026-04-17 14:33:54.446863392 +0000 UTC m=+3.184091388 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs") pod "network-metrics-daemon-4nvfl" (UID: "a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:53.948465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946900 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-slash\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-run-openvswitch\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-run-openvswitch\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946937 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-run-netns\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" 
Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946967 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-run-netns\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96429a18-321a-48f1-8136-1ce5473e98ee-ovnkube-config\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947001 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-etc-selinux\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tncrd\" (UniqueName: \"kubernetes.io/projected/1625a2e5-b33d-4e5d-85f4-286dd067f854-kube-api-access-tncrd\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947024 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-node-log\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 
17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.946829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-etc-openvswitch\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-registration-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-cni-bin\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947100 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-cni-dir\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947110 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-sys\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.949265 
ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96429a18-321a-48f1-8136-1ce5473e98ee-env-overrides\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947169 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-system-cni-dir\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947168 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/622b093f-2902-4bca-b6a5-3cad2c516a5a-registration-dir\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947188 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96429a18-321a-48f1-8136-1ce5473e98ee-host-cni-bin\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.949265 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-daemon-config\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.950123 
ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947313 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-os-release\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.950123 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1625a2e5-b33d-4e5d-85f4-286dd067f854-system-cni-dir\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.950123 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96429a18-321a-48f1-8136-1ce5473e98ee-ovnkube-config\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.950123 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96429a18-321a-48f1-8136-1ce5473e98ee-env-overrides\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.950123 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.947741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1625a2e5-b33d-4e5d-85f4-286dd067f854-multus-daemon-config\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.950123 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.948954 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-etc-tuned\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.950123 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.949687 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-tmp\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.950409 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.950195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96429a18-321a-48f1-8136-1ce5473e98ee-ovn-node-metrics-cert\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:53.958047 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.957983 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxlbv\" (UniqueName: \"kubernetes.io/projected/b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c-kube-api-access-vxlbv\") pod \"tuned-jtqxv\" (UID: \"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c\") " pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:53.958274 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.958220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tncrd\" (UniqueName: \"kubernetes.io/projected/1625a2e5-b33d-4e5d-85f4-286dd067f854-kube-api-access-tncrd\") pod \"multus-sr7t4\" (UID: \"1625a2e5-b33d-4e5d-85f4-286dd067f854\") " pod="openshift-multus/multus-sr7t4" Apr 17 14:33:53.958600 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.958574 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j5mmn\" (UniqueName: \"kubernetes.io/projected/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-kube-api-access-j5mmn\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:53.958851 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.958826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p49cf\" (UniqueName: \"kubernetes.io/projected/622b093f-2902-4bca-b6a5-3cad2c516a5a-kube-api-access-p49cf\") pod \"aws-ebs-csi-driver-node-22gz4\" (UID: \"622b093f-2902-4bca-b6a5-3cad2c516a5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:53.959525 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:53.959502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2mwk\" (UniqueName: \"kubernetes.io/projected/96429a18-321a-48f1-8136-1ce5473e98ee-kube-api-access-x2mwk\") pod \"ovnkube-node-pbxb9\" (UID: \"96429a18-321a-48f1-8136-1ce5473e98ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:54.038620 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.038582 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7vwcw" Apr 17 14:33:54.044416 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.044392 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tf94b" Apr 17 14:33:54.053188 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.053161 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v95rv" Apr 17 14:33:54.060824 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.060796 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-49c8v" Apr 17 14:33:54.069485 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.069460 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vfggb" Apr 17 14:33:54.077145 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.077122 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" Apr 17 14:33:54.085855 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.085833 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" Apr 17 14:33:54.089383 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.089361 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:54.093183 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.093165 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" Apr 17 14:33:54.099760 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.099738 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sr7t4" Apr 17 14:33:54.403844 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.403751 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-r2ggk"] Apr 17 14:33:54.406718 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.406699 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:54.406851 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:54.406776 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:33:54.450323 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.450284 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:54.450478 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.450345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-kubelet-config\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:54.450478 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.450377 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-dbus\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:54.450478 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:54.450418 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:54.450478 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.450465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j64g\" (UniqueName: \"kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g\") pod \"network-check-target-v8n74\" (UID: \"5b856db3-62df-499f-acb9-ca566ed0be77\") " pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:33:54.450685 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:54.450483 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs podName:a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:55.450466769 +0000 UTC m=+4.187694764 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs") pod "network-metrics-daemon-4nvfl" (UID: "a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:54.450685 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:54.450569 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:33:54.450685 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:54.450588 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:33:54.450685 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:54.450602 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9j64g for pod openshift-network-diagnostics/network-check-target-v8n74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:33:54.450685 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.450611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:54.450685 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:54.450651 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g podName:5b856db3-62df-499f-acb9-ca566ed0be77 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:55.450634703 +0000 UTC m=+4.187862684 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9j64g" (UniqueName: "kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g") pod "network-check-target-v8n74" (UID: "5b856db3-62df-499f-acb9-ca566ed0be77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:33:54.551227 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.551195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:54.551418 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.551263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-kubelet-config\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:54.551418 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.551292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-dbus\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:54.551418 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:54.551329 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:33:54.551418 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.551389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-dbus\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:54.551418 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:54.551398 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret podName:8cd56fcc-a291-4dce-ab6e-46a29c54ed43 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:55.051378935 +0000 UTC m=+3.788606928 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret") pod "global-pull-secret-syncer-r2ggk" (UID: "8cd56fcc-a291-4dce-ab6e-46a29c54ed43") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:33:54.551418 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.551409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-kubelet-config\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:54.703689 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:54.703652 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod622b093f_2902_4bca_b6a5_3cad2c516a5a.slice/crio-9975303eefae2b96d3d3f011eb8149da32e2cf42d7f4b949e64be62931e0e514 WatchSource:0}: Error finding container 9975303eefae2b96d3d3f011eb8149da32e2cf42d7f4b949e64be62931e0e514: Status 404 returned error can't find the container with id 9975303eefae2b96d3d3f011eb8149da32e2cf42d7f4b949e64be62931e0e514 Apr 17 14:33:54.704959 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:54.704834 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d17101b_12ed_4f95_89fd_3e67caa7db02.slice/crio-789fda50a3702331a9257d7a1b92dd8e97c74663e39323e588559d7eb84c906e WatchSource:0}: Error finding container 789fda50a3702331a9257d7a1b92dd8e97c74663e39323e588559d7eb84c906e: Status 404 returned error can't find the container with id 789fda50a3702331a9257d7a1b92dd8e97c74663e39323e588559d7eb84c906e Apr 17 14:33:54.708985 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:54.708896 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f8e225_290c_4c40_9e1f_2cbdc13a9d2c.slice/crio-c0584ea413e2ef1384f47204e45626c9926f5917010f9341f8b9676e56ddb528 WatchSource:0}: Error finding container c0584ea413e2ef1384f47204e45626c9926f5917010f9341f8b9676e56ddb528: Status 404 returned error can't find the container with id c0584ea413e2ef1384f47204e45626c9926f5917010f9341f8b9676e56ddb528 Apr 17 14:33:54.709895 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:54.709872 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd058fa6d_b2cb_4092_b4f9_49dee965fff3.slice/crio-afa6933fe8abf1083b8a89a39c98eda5816612405456a76df8efe94dc501cd28 WatchSource:0}: Error finding container afa6933fe8abf1083b8a89a39c98eda5816612405456a76df8efe94dc501cd28: Status 404 returned error can't find the container with id afa6933fe8abf1083b8a89a39c98eda5816612405456a76df8efe94dc501cd28 Apr 17 14:33:54.710525 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:54.710482 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47a7cca6_6a79_404f_9826_84de279713bb.slice/crio-d4709333e2d6b842ffe35d82f0ffbe95243582805fa946af2dc744dd70c87d07 WatchSource:0}: Error finding container d4709333e2d6b842ffe35d82f0ffbe95243582805fa946af2dc744dd70c87d07: Status 404 returned error can't find the container with id d4709333e2d6b842ffe35d82f0ffbe95243582805fa946af2dc744dd70c87d07 Apr 17 14:33:54.712373 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:54.712353 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1625a2e5_b33d_4e5d_85f4_286dd067f854.slice/crio-39a90d5ae3db12ab11949f1a9ccf53c194fa3c2f91086f10b2759e1cb404d06e WatchSource:0}: Error finding container 39a90d5ae3db12ab11949f1a9ccf53c194fa3c2f91086f10b2759e1cb404d06e: Status 404 returned error can't find 
the container with id 39a90d5ae3db12ab11949f1a9ccf53c194fa3c2f91086f10b2759e1cb404d06e Apr 17 14:33:54.712886 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:54.712847 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda095d6e7_582d_469a_8e0f_5f14ef60b4bf.slice/crio-274316541bd9b6757e999b292a3fb540c2404ccea63d3ef59e159f55bccb33dd WatchSource:0}: Error finding container 274316541bd9b6757e999b292a3fb540c2404ccea63d3ef59e159f55bccb33dd: Status 404 returned error can't find the container with id 274316541bd9b6757e999b292a3fb540c2404ccea63d3ef59e159f55bccb33dd Apr 17 14:33:54.713738 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:54.713720 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96429a18_321a_48f1_8136_1ce5473e98ee.slice/crio-9df21b6215534f35d2fc0d2918dead3e7680e9885a120234bf818267696dc2da WatchSource:0}: Error finding container 9df21b6215534f35d2fc0d2918dead3e7680e9885a120234bf818267696dc2da: Status 404 returned error can't find the container with id 9df21b6215534f35d2fc0d2918dead3e7680e9885a120234bf818267696dc2da Apr 17 14:33:54.715496 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:33:54.715473 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb72db081_842b_4ea0_a24a_f1f7916d7ee5.slice/crio-298463e4d254bbcd9fc6af2cd303f7ac71332b24b060394b2a35faf473f97968 WatchSource:0}: Error finding container 298463e4d254bbcd9fc6af2cd303f7ac71332b24b060394b2a35faf473f97968: Status 404 returned error can't find the container with id 298463e4d254bbcd9fc6af2cd303f7ac71332b24b060394b2a35faf473f97968 Apr 17 14:33:54.778037 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.778002 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:28:52 +0000 
UTC" deadline="2027-12-13 08:30:03.370673194 +0000 UTC" Apr 17 14:33:54.778037 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.778034 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14513h56m8.592641825s" Apr 17 14:33:54.806770 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.806738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal" event={"ID":"6caffc9b9af5358bace19aa9382d0587","Type":"ContainerStarted","Data":"f727231d63132a042113367fb39f72573139fedaa37979878a998c4ef14b4296"} Apr 17 14:33:54.807839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.807805 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tf94b" event={"ID":"b72db081-842b-4ea0-a24a-f1f7916d7ee5","Type":"ContainerStarted","Data":"298463e4d254bbcd9fc6af2cd303f7ac71332b24b060394b2a35faf473f97968"} Apr 17 14:33:54.808816 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.808794 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v95rv" event={"ID":"d058fa6d-b2cb-4092-b4f9-49dee965fff3","Type":"ContainerStarted","Data":"afa6933fe8abf1083b8a89a39c98eda5816612405456a76df8efe94dc501cd28"} Apr 17 14:33:54.809825 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.809805 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" event={"ID":"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c","Type":"ContainerStarted","Data":"c0584ea413e2ef1384f47204e45626c9926f5917010f9341f8b9676e56ddb528"} Apr 17 14:33:54.810752 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.810730 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vfggb" event={"ID":"9d17101b-12ed-4f95-89fd-3e67caa7db02","Type":"ContainerStarted","Data":"789fda50a3702331a9257d7a1b92dd8e97c74663e39323e588559d7eb84c906e"} Apr 17 
14:33:54.811735 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.811716 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" event={"ID":"96429a18-321a-48f1-8136-1ce5473e98ee","Type":"ContainerStarted","Data":"9df21b6215534f35d2fc0d2918dead3e7680e9885a120234bf818267696dc2da"} Apr 17 14:33:54.812615 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.812598 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7vwcw" event={"ID":"a095d6e7-582d-469a-8e0f-5f14ef60b4bf","Type":"ContainerStarted","Data":"274316541bd9b6757e999b292a3fb540c2404ccea63d3ef59e159f55bccb33dd"} Apr 17 14:33:54.813580 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.813562 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sr7t4" event={"ID":"1625a2e5-b33d-4e5d-85f4-286dd067f854","Type":"ContainerStarted","Data":"39a90d5ae3db12ab11949f1a9ccf53c194fa3c2f91086f10b2759e1cb404d06e"} Apr 17 14:33:54.814465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.814444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49c8v" event={"ID":"47a7cca6-6a79-404f-9826-84de279713bb","Type":"ContainerStarted","Data":"d4709333e2d6b842ffe35d82f0ffbe95243582805fa946af2dc744dd70c87d07"} Apr 17 14:33:54.815320 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:54.815300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" event={"ID":"622b093f-2902-4bca-b6a5-3cad2c516a5a","Type":"ContainerStarted","Data":"9975303eefae2b96d3d3f011eb8149da32e2cf42d7f4b949e64be62931e0e514"} Apr 17 14:33:55.054772 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:55.054700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret\") pod 
\"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:55.054900 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.054835 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:33:55.054934 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.054899 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret podName:8cd56fcc-a291-4dce-ab6e-46a29c54ed43 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:56.054883311 +0000 UTC m=+4.792111292 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret") pod "global-pull-secret-syncer-r2ggk" (UID: "8cd56fcc-a291-4dce-ab6e-46a29c54ed43") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:33:55.459364 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:55.458367 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j64g\" (UniqueName: \"kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g\") pod \"network-check-target-v8n74\" (UID: \"5b856db3-62df-499f-acb9-ca566ed0be77\") " pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:33:55.459364 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:55.458457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:55.459364 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.458579 2577 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:55.459364 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.458644 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs podName:a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:57.458624649 +0000 UTC m=+6.195852630 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs") pod "network-metrics-daemon-4nvfl" (UID: "a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:55.459364 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.459113 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:33:55.459364 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.459138 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:33:55.459364 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.459167 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9j64g for pod openshift-network-diagnostics/network-check-target-v8n74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:33:55.459364 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.459215 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g 
podName:5b856db3-62df-499f-acb9-ca566ed0be77 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:57.459198832 +0000 UTC m=+6.196426815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9j64g" (UniqueName: "kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g") pod "network-check-target-v8n74" (UID: "5b856db3-62df-499f-acb9-ca566ed0be77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:33:55.796016 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:55.795930 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:55.796465 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.796069 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:33:55.796572 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:55.796544 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:55.796681 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.796659 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:33:55.796758 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:55.796746 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:33:55.796834 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:55.796818 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:33:55.825403 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:55.825356 2577 generic.go:358] "Generic (PLEG): container finished" podID="2b44101f799cef1b7257884fc1ea11bb" containerID="3818b20b33b70a38e9052e9cd13108744fee0f281b83694d3e5cb1180d0389d2" exitCode=0 Apr 17 14:33:55.826508 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:55.826477 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" event={"ID":"2b44101f799cef1b7257884fc1ea11bb","Type":"ContainerDied","Data":"3818b20b33b70a38e9052e9cd13108744fee0f281b83694d3e5cb1180d0389d2"} Apr 17 14:33:55.840320 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:55.839460 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-180.ec2.internal" podStartSLOduration=2.839441081 podStartE2EDuration="2.839441081s" podCreationTimestamp="2026-04-17 14:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:33:55.839020222 +0000 UTC m=+4.576248223" 
watchObservedRunningTime="2026-04-17 14:33:55.839441081 +0000 UTC m=+4.576669083" Apr 17 14:33:56.064648 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:56.064549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:56.064869 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:56.064848 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:33:56.064943 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:56.064923 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret podName:8cd56fcc-a291-4dce-ab6e-46a29c54ed43 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:58.064903344 +0000 UTC m=+6.802131358 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret") pod "global-pull-secret-syncer-r2ggk" (UID: "8cd56fcc-a291-4dce-ab6e-46a29c54ed43") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:33:56.856324 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:56.856276 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" event={"ID":"2b44101f799cef1b7257884fc1ea11bb","Type":"ContainerStarted","Data":"67d7cb9e711a117fa368386097a939a963b6ac3ed3474aebcd29a5fa8b5604dd"} Apr 17 14:33:57.478693 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:57.478651 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j64g\" (UniqueName: \"kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g\") pod \"network-check-target-v8n74\" (UID: \"5b856db3-62df-499f-acb9-ca566ed0be77\") " pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:33:57.478888 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:57.478736 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:57.478888 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:57.478867 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:57.478999 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:57.478927 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs 
podName:a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:01.478909348 +0000 UTC m=+10.216137329 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs") pod "network-metrics-daemon-4nvfl" (UID: "a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:57.479366 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:57.479347 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:33:57.479445 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:57.479370 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:33:57.479445 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:57.479384 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9j64g for pod openshift-network-diagnostics/network-check-target-v8n74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:33:57.479445 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:57.479433 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g podName:5b856db3-62df-499f-acb9-ca566ed0be77 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:01.479413828 +0000 UTC m=+10.216641812 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9j64g" (UniqueName: "kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g") pod "network-check-target-v8n74" (UID: "5b856db3-62df-499f-acb9-ca566ed0be77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:33:57.797346 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:57.797264 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:57.797519 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:57.797390 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:33:57.797798 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:57.797779 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:57.797910 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:57.797889 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:33:57.798440 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:57.798395 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:33:57.798554 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:57.798478 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:33:58.084570 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:58.083916 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:58.084570 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:58.084089 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:33:58.084570 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:58.084165 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret podName:8cd56fcc-a291-4dce-ab6e-46a29c54ed43 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:02.084133966 +0000 UTC m=+10.821361945 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret") pod "global-pull-secret-syncer-r2ggk" (UID: "8cd56fcc-a291-4dce-ab6e-46a29c54ed43") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:33:59.795984 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:59.795946 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:33:59.796460 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:59.796107 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:33:59.796460 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:59.795945 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:33:59.796460 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:59.796232 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:33:59.796460 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:33:59.795945 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:33:59.796460 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:33:59.796314 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:34:01.511754 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:01.511713 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j64g\" (UniqueName: \"kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g\") pod \"network-check-target-v8n74\" (UID: \"5b856db3-62df-499f-acb9-ca566ed0be77\") " pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:01.512261 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:01.511870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:01.512261 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:01.511907 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:34:01.512261 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:01.511937 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:34:01.512261 ip-10-0-135-180 kubenswrapper[2577]: 
E0417 14:34:01.511951 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9j64g for pod openshift-network-diagnostics/network-check-target-v8n74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:01.512261 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:01.512004 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:01.512261 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:01.512014 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g podName:5b856db3-62df-499f-acb9-ca566ed0be77 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:09.511994005 +0000 UTC m=+18.249222004 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9j64g" (UniqueName: "kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g") pod "network-check-target-v8n74" (UID: "5b856db3-62df-499f-acb9-ca566ed0be77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:01.512261 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:01.512060 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs podName:a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:09.512042821 +0000 UTC m=+18.249270804 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs") pod "network-metrics-daemon-4nvfl" (UID: "a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:01.795986 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:01.795903 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:01.796140 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:01.796006 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:34:01.796140 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:01.796039 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:01.796441 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:01.796185 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:01.796441 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:01.796212 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:34:01.796441 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:01.796290 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:34:02.116691 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:02.116352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:02.116691 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:02.116526 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:34:02.116912 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:02.116738 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret podName:8cd56fcc-a291-4dce-ab6e-46a29c54ed43 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:10.11671777 +0000 UTC m=+18.853945752 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret") pod "global-pull-secret-syncer-r2ggk" (UID: "8cd56fcc-a291-4dce-ab6e-46a29c54ed43") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:34:03.795362 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.795318 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:03.795362 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.795353 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:03.796094 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:03.795432 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:34:03.796094 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:03.795513 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:34:03.796094 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.795549 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:03.796094 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:03.795622 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:34:03.871283 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.871244 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7vwcw" event={"ID":"a095d6e7-582d-469a-8e0f-5f14ef60b4bf","Type":"ContainerStarted","Data":"0dc67dacdaaf481eb75b332f2c9d2cb7daa45814889b757e6019fd4a5bd141a5"} Apr 17 14:34:03.873664 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.873631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49c8v" event={"ID":"47a7cca6-6a79-404f-9826-84de279713bb","Type":"ContainerStarted","Data":"3704b0dfc0c2f43f1817347e0ebf48af7d1083a737523644581b2e9974c4f903"} Apr 17 14:34:03.875927 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.875784 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" event={"ID":"622b093f-2902-4bca-b6a5-3cad2c516a5a","Type":"ContainerStarted","Data":"7c76ba10294d65476892570495b97340c7100e452feb39333dae5b85ede964ea"} Apr 17 14:34:03.878178 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.878114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tf94b" event={"ID":"b72db081-842b-4ea0-a24a-f1f7916d7ee5","Type":"ContainerStarted","Data":"9177156ab0124089be82eefb6bee62a40316dbb4fa559a8d0345ec0fa22d1886"} Apr 17 14:34:03.880281 ip-10-0-135-180 kubenswrapper[2577]: I0417 
14:34:03.880255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v95rv" event={"ID":"d058fa6d-b2cb-4092-b4f9-49dee965fff3","Type":"ContainerStarted","Data":"2edde0e03dcf853175591b07a8febe75fcf647a31a337dffb589d3be0c1976cc"} Apr 17 14:34:03.883071 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.883034 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" event={"ID":"b0f8e225-290c-4c40-9e1f-2cbdc13a9d2c","Type":"ContainerStarted","Data":"a849d972667132722c412d8fdcfeab64b4b21096fd73e57e7160b37e1e8491d7"} Apr 17 14:34:03.883187 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.883093 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7vwcw" podStartSLOduration=4.08820355 podStartE2EDuration="12.883077453s" podCreationTimestamp="2026-04-17 14:33:51 +0000 UTC" firstStartedPulling="2026-04-17 14:33:54.715722896 +0000 UTC m=+3.452950882" lastFinishedPulling="2026-04-17 14:34:03.510596803 +0000 UTC m=+12.247824785" observedRunningTime="2026-04-17 14:34:03.882484513 +0000 UTC m=+12.619712516" watchObservedRunningTime="2026-04-17 14:34:03.883077453 +0000 UTC m=+12.620305446" Apr 17 14:34:03.883491 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.883448 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-180.ec2.internal" podStartSLOduration=10.883410774 podStartE2EDuration="10.883410774s" podCreationTimestamp="2026-04-17 14:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:33:56.870925413 +0000 UTC m=+5.608153414" watchObservedRunningTime="2026-04-17 14:34:03.883410774 +0000 UTC m=+12.620638775" Apr 17 14:34:03.893654 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.893385 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/node-ca-tf94b" podStartSLOduration=4.100040002 podStartE2EDuration="12.893367009s" podCreationTimestamp="2026-04-17 14:33:51 +0000 UTC" firstStartedPulling="2026-04-17 14:33:54.717281281 +0000 UTC m=+3.454509266" lastFinishedPulling="2026-04-17 14:34:03.510608289 +0000 UTC m=+12.247836273" observedRunningTime="2026-04-17 14:34:03.892766923 +0000 UTC m=+12.629994925" watchObservedRunningTime="2026-04-17 14:34:03.893367009 +0000 UTC m=+12.630595011" Apr 17 14:34:03.903638 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.903581 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v95rv" podStartSLOduration=4.104418401 podStartE2EDuration="12.903563443s" podCreationTimestamp="2026-04-17 14:33:51 +0000 UTC" firstStartedPulling="2026-04-17 14:33:54.711725613 +0000 UTC m=+3.448953597" lastFinishedPulling="2026-04-17 14:34:03.510870654 +0000 UTC m=+12.248098639" observedRunningTime="2026-04-17 14:34:03.903399961 +0000 UTC m=+12.640627963" watchObservedRunningTime="2026-04-17 14:34:03.903563443 +0000 UTC m=+12.640791446" Apr 17 14:34:03.936602 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:03.934233 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jtqxv" podStartSLOduration=3.118560711 podStartE2EDuration="11.934215267s" podCreationTimestamp="2026-04-17 14:33:52 +0000 UTC" firstStartedPulling="2026-04-17 14:33:54.710782545 +0000 UTC m=+3.448010530" lastFinishedPulling="2026-04-17 14:34:03.526437099 +0000 UTC m=+12.263665086" observedRunningTime="2026-04-17 14:34:03.931921108 +0000 UTC m=+12.669149108" watchObservedRunningTime="2026-04-17 14:34:03.934215267 +0000 UTC m=+12.671443271" Apr 17 14:34:04.886902 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:04.886863 2577 generic.go:358] "Generic (PLEG): container finished" podID="47a7cca6-6a79-404f-9826-84de279713bb" 
containerID="3704b0dfc0c2f43f1817347e0ebf48af7d1083a737523644581b2e9974c4f903" exitCode=0 Apr 17 14:34:04.887574 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:04.887026 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49c8v" event={"ID":"47a7cca6-6a79-404f-9826-84de279713bb","Type":"ContainerDied","Data":"3704b0dfc0c2f43f1817347e0ebf48af7d1083a737523644581b2e9974c4f903"} Apr 17 14:34:05.796072 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:05.796030 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:05.796267 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:05.796043 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:05.796267 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:05.796175 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:34:05.796372 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:05.796273 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:34:05.796372 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:05.796322 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:05.796463 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:05.796396 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:34:05.889909 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:05.889864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vfggb" event={"ID":"9d17101b-12ed-4f95-89fd-3e67caa7db02","Type":"ContainerStarted","Data":"0392ef49b32e58131450635b73d1f8998899cec434aea24c0dc649d391f5d30f"} Apr 17 14:34:07.620516 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:07.620242 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7vwcw" Apr 17 14:34:07.621149 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:07.621128 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7vwcw" Apr 17 14:34:07.633417 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:07.633375 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vfggb" podStartSLOduration=6.826260616 podStartE2EDuration="15.633361272s" podCreationTimestamp="2026-04-17 14:33:52 +0000 UTC" firstStartedPulling="2026-04-17 14:33:54.707276867 +0000 UTC m=+3.444504853" lastFinishedPulling="2026-04-17 14:34:03.514377518 +0000 UTC m=+12.251605509" observedRunningTime="2026-04-17 14:34:05.900648325 +0000 UTC m=+14.637876326" watchObservedRunningTime="2026-04-17 14:34:07.633361272 +0000 UTC m=+16.370589274" Apr 17 14:34:07.795791 ip-10-0-135-180 
kubenswrapper[2577]: I0417 14:34:07.795760 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:07.795986 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:07.795760 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:07.795986 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:07.795884 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:34:07.795986 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:07.795762 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:07.795986 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:07.795966 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:34:07.796193 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:07.796009 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:34:09.574392 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:09.574344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j64g\" (UniqueName: \"kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g\") pod \"network-check-target-v8n74\" (UID: \"5b856db3-62df-499f-acb9-ca566ed0be77\") " pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:09.574847 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:09.574437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:09.574847 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:09.574543 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:09.574847 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:09.574555 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:34:09.574847 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:09.574580 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:34:09.574847 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:09.574595 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9j64g for pod openshift-network-diagnostics/network-check-target-v8n74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:09.574847 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:09.574622 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs podName:a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:25.574603719 +0000 UTC m=+34.311831703 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs") pod "network-metrics-daemon-4nvfl" (UID: "a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:09.574847 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:09.574648 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g podName:5b856db3-62df-499f-acb9-ca566ed0be77 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:25.574630188 +0000 UTC m=+34.311858172 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9j64g" (UniqueName: "kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g") pod "network-check-target-v8n74" (UID: "5b856db3-62df-499f-acb9-ca566ed0be77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:09.795478 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:09.795356 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:09.795478 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:09.795408 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:09.795717 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:09.795491 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:34:09.795717 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:09.795566 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:34:09.795717 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:09.795626 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:09.795865 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:09.795714 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:34:10.179496 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:10.179454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:10.179683 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:10.179632 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:34:10.179758 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:10.179702 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret podName:8cd56fcc-a291-4dce-ab6e-46a29c54ed43 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:26.179682678 +0000 UTC m=+34.916910658 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret") pod "global-pull-secret-syncer-r2ggk" (UID: "8cd56fcc-a291-4dce-ab6e-46a29c54ed43") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:34:11.796894 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:11.796869 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:11.797466 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:11.796970 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:11.797466 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:11.797015 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:34:11.797466 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:11.797045 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:34:11.797466 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:11.797072 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:11.797466 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:11.797183 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:34:13.795338 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:13.795299 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:13.795862 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:13.795304 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:13.795862 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:13.795426 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:34:13.795862 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:13.795506 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:34:13.795862 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:13.795309 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:13.795862 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:13.795627 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:34:15.153969 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.153943 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 14:34:15.795597 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.795563 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:15.795787 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.795568 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:15.795787 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:15.795694 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77" Apr 17 14:34:15.795787 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.795569 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:15.795787 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:15.795776 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:34:15.796002 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:15.795877 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43" Apr 17 14:34:15.798353 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.798272 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T14:34:15.153963675Z","UUID":"34d421b3-4638-4186-b381-0b6e7cdba367","Handler":null,"Name":"","Endpoint":""} Apr 17 14:34:15.801472 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.801454 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 14:34:15.801590 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.801492 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 14:34:15.908827 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.908793 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" event={"ID":"96429a18-321a-48f1-8136-1ce5473e98ee","Type":"ContainerStarted","Data":"e9b8b2aff180f53988c6b2e13945850e2165c341e9df0d5a231395f0a070e088"} Apr 17 14:34:15.908827 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.908835 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" 
event={"ID":"96429a18-321a-48f1-8136-1ce5473e98ee","Type":"ContainerStarted","Data":"fd0b0ffd51e16d776195dfdd127452000808c5b0121ee54c19b8ca26c9d212fe"} Apr 17 14:34:15.909063 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.908850 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" event={"ID":"96429a18-321a-48f1-8136-1ce5473e98ee","Type":"ContainerStarted","Data":"21c4bf5b9259ef94808c5c918e103119ce56834b589aa5cb56bd11cfe13f5a33"} Apr 17 14:34:15.909063 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.908863 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" event={"ID":"96429a18-321a-48f1-8136-1ce5473e98ee","Type":"ContainerStarted","Data":"c9b3dc731721bebe6f696877bc746908799e260b64105a96d3c5082af00e41f0"} Apr 17 14:34:15.909063 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.908875 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" event={"ID":"96429a18-321a-48f1-8136-1ce5473e98ee","Type":"ContainerStarted","Data":"c844a687d185268a320b71801a759d823740cf81aed54158abe651a9d7f4e268"} Apr 17 14:34:15.909063 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.908886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" event={"ID":"96429a18-321a-48f1-8136-1ce5473e98ee","Type":"ContainerStarted","Data":"16a63cc9d8fdb95c61d0e7308d571720f37b1f0f3affc04e6bcca04e48ab1b77"} Apr 17 14:34:15.910100 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.910066 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sr7t4" event={"ID":"1625a2e5-b33d-4e5d-85f4-286dd067f854","Type":"ContainerStarted","Data":"f6b2f32e6e195c867e72d530915120ce495d90654ed7ed6efc726ec5742aaa81"} Apr 17 14:34:15.911553 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.911521 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="47a7cca6-6a79-404f-9826-84de279713bb" containerID="4ec6dddbdc90fc699e9a1d8d601427b8dea50479cdb9f8c4ac26fe89bc575e13" exitCode=0
Apr 17 14:34:15.911673 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.911553 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49c8v" event={"ID":"47a7cca6-6a79-404f-9826-84de279713bb","Type":"ContainerDied","Data":"4ec6dddbdc90fc699e9a1d8d601427b8dea50479cdb9f8c4ac26fe89bc575e13"}
Apr 17 14:34:15.913213 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.913191 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" event={"ID":"622b093f-2902-4bca-b6a5-3cad2c516a5a","Type":"ContainerStarted","Data":"610f2bbd011f40f62045cab30ca48a45e4a645c99b4ff9019d214ebd4cbee8e2"}
Apr 17 14:34:15.926044 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:15.926005 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sr7t4" podStartSLOduration=3.670737391 podStartE2EDuration="23.92599208s" podCreationTimestamp="2026-04-17 14:33:52 +0000 UTC" firstStartedPulling="2026-04-17 14:33:54.71385021 +0000 UTC m=+3.451078191" lastFinishedPulling="2026-04-17 14:34:14.969104898 +0000 UTC m=+23.706332880" observedRunningTime="2026-04-17 14:34:15.925403617 +0000 UTC m=+24.662631618" watchObservedRunningTime="2026-04-17 14:34:15.92599208 +0000 UTC m=+24.663220114"
Apr 17 14:34:16.917411 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:16.917326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" event={"ID":"622b093f-2902-4bca-b6a5-3cad2c516a5a","Type":"ContainerStarted","Data":"79df2e2ed6d4e7ba2ad2eec0a61c71aaadcf0707be5845dc1ccc65a83f57928e"}
Apr 17 14:34:16.934708 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:16.934649 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22gz4" podStartSLOduration=3.069655461 podStartE2EDuration="24.934633167s" podCreationTimestamp="2026-04-17 14:33:52 +0000 UTC" firstStartedPulling="2026-04-17 14:33:54.706241621 +0000 UTC m=+3.443469600" lastFinishedPulling="2026-04-17 14:34:16.571219327 +0000 UTC m=+25.308447306" observedRunningTime="2026-04-17 14:34:16.933499231 +0000 UTC m=+25.670727232" watchObservedRunningTime="2026-04-17 14:34:16.934633167 +0000 UTC m=+25.671861169"
Apr 17 14:34:17.799347 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:17.799320 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:17.799493 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:17.799331 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl"
Apr 17 14:34:17.799493 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:17.799413 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77"
Apr 17 14:34:17.799493 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:17.799431 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:17.799647 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:17.799510 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87"
Apr 17 14:34:17.799647 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:17.799581 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43"
Apr 17 14:34:17.922400 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:17.922357 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" event={"ID":"96429a18-321a-48f1-8136-1ce5473e98ee","Type":"ContainerStarted","Data":"82f556bad6f91813830bb58cb22ec443d2b7096e6673146c851530b28b455df5"}
Apr 17 14:34:17.924056 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:17.924033 2577 generic.go:358] "Generic (PLEG): container finished" podID="47a7cca6-6a79-404f-9826-84de279713bb" containerID="a622bcbbb89bd49161dc9b0f4a68760d441b7e1a9261deeb767bba64cc97aa87" exitCode=0
Apr 17 14:34:17.924181 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:17.924111 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49c8v" event={"ID":"47a7cca6-6a79-404f-9826-84de279713bb","Type":"ContainerDied","Data":"a622bcbbb89bd49161dc9b0f4a68760d441b7e1a9261deeb767bba64cc97aa87"}
Apr 17 14:34:18.928416 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:18.928378 2577 generic.go:358] "Generic (PLEG): container finished" podID="47a7cca6-6a79-404f-9826-84de279713bb" containerID="b4a52b9a4d4a5e3a28e9f3348a77538988807704522661f1ae7be807aaf1e519" exitCode=0
Apr 17 14:34:18.928840 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:18.928432 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49c8v" event={"ID":"47a7cca6-6a79-404f-9826-84de279713bb","Type":"ContainerDied","Data":"b4a52b9a4d4a5e3a28e9f3348a77538988807704522661f1ae7be807aaf1e519"}
Apr 17 14:34:19.796013 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:19.795974 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:19.796200 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:19.796091 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl"
Apr 17 14:34:19.796200 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:19.796097 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43"
Apr 17 14:34:19.796331 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:19.796219 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87"
Apr 17 14:34:19.796331 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:19.796266 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:19.796331 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:19.796311 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77"
Apr 17 14:34:19.934846 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:19.934578 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" event={"ID":"96429a18-321a-48f1-8136-1ce5473e98ee","Type":"ContainerStarted","Data":"635ea5fb1a72080c63e6b617da37db1fbed2ce591ab9f7876d8925e7382da95b"}
Apr 17 14:34:19.935319 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:19.934870 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9"
Apr 17 14:34:19.935319 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:19.934889 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9"
Apr 17 14:34:19.953254 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:19.953064 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9"
Apr 17 14:34:19.963506 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:19.963401 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9" podStartSLOduration=7.710527265 podStartE2EDuration="27.963382811s" podCreationTimestamp="2026-04-17 14:33:52 +0000 UTC" firstStartedPulling="2026-04-17 14:33:54.716635212 +0000 UTC m=+3.453863191" lastFinishedPulling="2026-04-17 14:34:14.969490755 +0000 UTC m=+23.706718737" observedRunningTime="2026-04-17 14:34:19.96278025 +0000 UTC m=+28.700008252" watchObservedRunningTime="2026-04-17 14:34:19.963382811 +0000 UTC m=+28.700610814"
Apr 17 14:34:20.874966 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:20.874928 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7vwcw"
Apr 17 14:34:20.875204 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:20.875087 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 14:34:20.875765 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:20.875733 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7vwcw"
Apr 17 14:34:20.937056 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:20.937026 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9"
Apr 17 14:34:20.955342 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:20.955312 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9"
Apr 17 14:34:21.796508 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:21.796476 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:21.796662 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:21.796562 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43"
Apr 17 14:34:21.796662 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:21.796652 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl"
Apr 17 14:34:21.796781 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:21.796768 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:21.796817 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:21.796786 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87"
Apr 17 14:34:21.796858 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:21.796843 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77"
Apr 17 14:34:21.842524 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:21.842443 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r2ggk"]
Apr 17 14:34:21.845174 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:21.845130 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-v8n74"]
Apr 17 14:34:21.854805 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:21.854769 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4nvfl"]
Apr 17 14:34:21.938822 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:21.938735 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:21.938822 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:21.938771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:21.939278 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:21.938845 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43"
Apr 17 14:34:21.939278 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:21.938744 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl"
Apr 17 14:34:21.939278 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:21.938949 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87"
Apr 17 14:34:21.939278 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:21.939089 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77"
Apr 17 14:34:23.796172 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:23.796128 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:23.796621 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:23.796584 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77"
Apr 17 14:34:23.796682 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:23.796628 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl"
Apr 17 14:34:23.796825 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:23.796800 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87"
Apr 17 14:34:23.796916 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:23.796864 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:23.797144 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:23.797041 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43"
Apr 17 14:34:25.600748 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:25.600492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j64g\" (UniqueName: \"kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g\") pod \"network-check-target-v8n74\" (UID: \"5b856db3-62df-499f-acb9-ca566ed0be77\") " pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:25.601425 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:25.600803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl"
Apr 17 14:34:25.601425 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:25.600643 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:34:25.601425 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:25.600866 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:34:25.601425 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:25.600886 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9j64g for pod openshift-network-diagnostics/network-check-target-v8n74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:25.601425 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:25.600935 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:25.601425 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:25.600973 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g podName:5b856db3-62df-499f-acb9-ca566ed0be77 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:57.600952183 +0000 UTC m=+66.338180185 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9j64g" (UniqueName: "kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g") pod "network-check-target-v8n74" (UID: "5b856db3-62df-499f-acb9-ca566ed0be77") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:25.601425 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:25.600997 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs podName:a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:57.600985278 +0000 UTC m=+66.338213258 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs") pod "network-metrics-daemon-4nvfl" (UID: "a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:25.795662 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:25.795626 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl"
Apr 17 14:34:25.795854 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:25.795768 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87"
Apr 17 14:34:25.795854 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:25.795626 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:25.795966 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:25.795864 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77"
Apr 17 14:34:25.795966 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:25.795626 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:25.795966 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:25.795947 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43"
Apr 17 14:34:26.205422 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:26.205371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:26.205606 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:26.205554 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:26.205663 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:26.205618 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret podName:8cd56fcc-a291-4dce-ab6e-46a29c54ed43 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:58.205599314 +0000 UTC m=+66.942827326 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret") pod "global-pull-secret-syncer-r2ggk" (UID: "8cd56fcc-a291-4dce-ab6e-46a29c54ed43") : object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:27.795969 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:27.795927 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:27.796617 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:27.795927 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:27.796617 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:27.796054 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8n74" podUID="5b856db3-62df-499f-acb9-ca566ed0be77"
Apr 17 14:34:27.796617 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:27.796112 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r2ggk" podUID="8cd56fcc-a291-4dce-ab6e-46a29c54ed43"
Apr 17 14:34:27.796617 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:27.795927 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl"
Apr 17 14:34:27.796617 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:27.796253 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87"
Apr 17 14:34:27.955583 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:27.955550 2577 generic.go:358] "Generic (PLEG): container finished" podID="47a7cca6-6a79-404f-9826-84de279713bb" containerID="2f5742123b1fa47f7f553a74f06e7406ca656c5de468486accc1243380a0ad9b" exitCode=0
Apr 17 14:34:27.955754 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:27.955600 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49c8v" event={"ID":"47a7cca6-6a79-404f-9826-84de279713bb","Type":"ContainerDied","Data":"2f5742123b1fa47f7f553a74f06e7406ca656c5de468486accc1243380a0ad9b"}
Apr 17 14:34:28.622705 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.622678 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-180.ec2.internal" event="NodeReady"
Apr 17 14:34:28.622864 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.622802 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 14:34:28.661305 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.661236 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t99cr"]
Apr 17 14:34:28.686238 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.686212 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r25f2"]
Apr 17 14:34:28.686388 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.686376 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.688624 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.688593 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 14:34:28.688624 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.688594 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 14:34:28.688825 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.688632 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fr4dp\""
Apr 17 14:34:28.708006 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.707984 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t99cr"]
Apr 17 14:34:28.708006 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.708008 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r25f2"]
Apr 17 14:34:28.708198 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.708102 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r25f2"
Apr 17 14:34:28.710262 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.710240 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 14:34:28.710353 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.710264 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dtcwn\""
Apr 17 14:34:28.710555 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.710539 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 14:34:28.711039 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.711023 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 14:34:28.823394 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.823346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmnp7\" (UniqueName: \"kubernetes.io/projected/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-kube-api-access-zmnp7\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.823777 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.823409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-tmp-dir\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.823777 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.823449 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-config-volume\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.823777 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.823493 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.823777 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.823546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r545w\" (UniqueName: \"kubernetes.io/projected/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-kube-api-access-r545w\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2"
Apr 17 14:34:28.823777 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.823591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2"
Apr 17 14:34:28.924794 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.924696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.924794 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.924741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r545w\" (UniqueName: \"kubernetes.io/projected/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-kube-api-access-r545w\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2"
Apr 17 14:34:28.924794 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.924767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2"
Apr 17 14:34:28.924794 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.924794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmnp7\" (UniqueName: \"kubernetes.io/projected/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-kube-api-access-zmnp7\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.925112 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.924827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-tmp-dir\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.925112 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.924845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-config-volume\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.925112 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:28.924875 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:34:28.925112 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:28.924962 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls podName:fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:29.424938362 +0000 UTC m=+38.162166342 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls") pod "dns-default-t99cr" (UID: "fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0") : secret "dns-default-metrics-tls" not found
Apr 17 14:34:28.925324 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.925253 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-tmp-dir\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.925324 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:28.924875 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:34:28.925387 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:28.925369 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert podName:df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed nodeName:}" failed. No retries permitted until 2026-04-17 14:34:29.425351051 +0000 UTC m=+38.162579050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert") pod "ingress-canary-r25f2" (UID: "df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed") : secret "canary-serving-cert" not found
Apr 17 14:34:28.925505 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.925489 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-config-volume\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.933576 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.933552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmnp7\" (UniqueName: \"kubernetes.io/projected/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-kube-api-access-zmnp7\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:28.933853 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.933833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r545w\" (UniqueName: \"kubernetes.io/projected/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-kube-api-access-r545w\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2"
Apr 17 14:34:28.960701 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.960665 2577 generic.go:358] "Generic (PLEG): container finished" podID="47a7cca6-6a79-404f-9826-84de279713bb" containerID="fb25378145c5babc0051f34001440575527ec69e09dee6fae75e9a54fbedf1ec" exitCode=0
Apr 17 14:34:28.960859 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:28.960712 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49c8v" event={"ID":"47a7cca6-6a79-404f-9826-84de279713bb","Type":"ContainerDied","Data":"fb25378145c5babc0051f34001440575527ec69e09dee6fae75e9a54fbedf1ec"}
Apr 17 14:34:29.429804 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.429768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:29.429996 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.429830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2"
Apr 17 14:34:29.429996 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:29.429921 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:34:29.429996 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:29.429958 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:34:29.429996 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:29.429984 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls podName:fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:30.429968515 +0000 UTC m=+39.167196495 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls") pod "dns-default-t99cr" (UID: "fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0") : secret "dns-default-metrics-tls" not found Apr 17 14:34:29.430202 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:29.430008 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert podName:df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed nodeName:}" failed. No retries permitted until 2026-04-17 14:34:30.429992665 +0000 UTC m=+39.167220649 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert") pod "ingress-canary-r25f2" (UID: "df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed") : secret "canary-serving-cert" not found Apr 17 14:34:29.798109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.798079 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk" Apr 17 14:34:29.798283 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.798084 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74" Apr 17 14:34:29.798283 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.798084 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:34:29.800750 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.800728 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:34:29.800867 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.800767 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tg92k\"" Apr 17 14:34:29.800996 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.800982 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 14:34:29.801433 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.801419 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:34:29.801526 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.801457 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mnbtt\"" Apr 17 14:34:29.801526 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.801473 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:34:29.965137 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.965103 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49c8v" event={"ID":"47a7cca6-6a79-404f-9826-84de279713bb","Type":"ContainerStarted","Data":"54ebfd11f7e1fb0acab4fbc5ffea2f347a98276183702800f8b0729b7ec833de"} Apr 17 14:34:29.984999 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:29.984937 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-49c8v" podStartSLOduration=5.6992453130000005 
podStartE2EDuration="37.984920298s" podCreationTimestamp="2026-04-17 14:33:52 +0000 UTC" firstStartedPulling="2026-04-17 14:33:54.712665013 +0000 UTC m=+3.449892993" lastFinishedPulling="2026-04-17 14:34:26.998339993 +0000 UTC m=+35.735567978" observedRunningTime="2026-04-17 14:34:29.983781593 +0000 UTC m=+38.721009591" watchObservedRunningTime="2026-04-17 14:34:29.984920298 +0000 UTC m=+38.722148298" Apr 17 14:34:30.438452 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.438416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr" Apr 17 14:34:30.438624 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.438466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2" Apr 17 14:34:30.438624 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:30.438565 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:30.438624 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:30.438577 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:30.438624 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:30.438616 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert podName:df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed nodeName:}" failed. No retries permitted until 2026-04-17 14:34:32.438602079 +0000 UTC m=+41.175830058 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert") pod "ingress-canary-r25f2" (UID: "df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed") : secret "canary-serving-cert" not found Apr 17 14:34:30.438753 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:30.438629 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls podName:fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:32.438622918 +0000 UTC m=+41.175850896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls") pod "dns-default-t99cr" (UID: "fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0") : secret "dns-default-metrics-tls" not found Apr 17 14:34:30.634540 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.634505 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9"] Apr 17 14:34:30.660462 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.660428 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9"] Apr 17 14:34:30.660597 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.660543 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" Apr 17 14:34:30.662747 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.662721 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 14:34:30.662888 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.662829 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 14:34:30.662888 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.662866 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 14:34:30.663536 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.663514 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 14:34:30.663536 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.663530 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-hkbzc\"" Apr 17 14:34:30.670360 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.670337 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw"] Apr 17 14:34:30.681663 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.681632 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:30.682421 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.682399 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw"] Apr 17 14:34:30.683640 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.683619 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 14:34:30.740736 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.740696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5zg6\" (UniqueName: \"kubernetes.io/projected/0237de7d-a2e2-4924-bc12-962079df4f96-kube-api-access-w5zg6\") pod \"managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9\" (UID: \"0237de7d-a2e2-4924-bc12-962079df4f96\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" Apr 17 14:34:30.740909 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.740777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0237de7d-a2e2-4924-bc12-962079df4f96-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9\" (UID: \"0237de7d-a2e2-4924-bc12-962079df4f96\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" Apr 17 14:34:30.841117 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.841084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5zg6\" (UniqueName: \"kubernetes.io/projected/0237de7d-a2e2-4924-bc12-962079df4f96-kube-api-access-w5zg6\") pod \"managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9\" (UID: \"0237de7d-a2e2-4924-bc12-962079df4f96\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" Apr 17 14:34:30.841222 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.841142 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4bnf\" (UniqueName: \"kubernetes.io/projected/9381de67-e5f0-4a19-bd6a-1041cf6f0c8b-kube-api-access-d4bnf\") pod \"klusterlet-addon-workmgr-7c4b667cb5-pg4vw\" (UID: \"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:30.841222 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.841189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0237de7d-a2e2-4924-bc12-962079df4f96-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9\" (UID: \"0237de7d-a2e2-4924-bc12-962079df4f96\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" Apr 17 14:34:30.841222 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.841206 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9381de67-e5f0-4a19-bd6a-1041cf6f0c8b-tmp\") pod \"klusterlet-addon-workmgr-7c4b667cb5-pg4vw\" (UID: \"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:30.841333 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.841231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9381de67-e5f0-4a19-bd6a-1041cf6f0c8b-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c4b667cb5-pg4vw\" (UID: \"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:30.843839 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.843811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0237de7d-a2e2-4924-bc12-962079df4f96-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9\" (UID: \"0237de7d-a2e2-4924-bc12-962079df4f96\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" Apr 17 14:34:30.848470 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.848446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5zg6\" (UniqueName: \"kubernetes.io/projected/0237de7d-a2e2-4924-bc12-962079df4f96-kube-api-access-w5zg6\") pod \"managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9\" (UID: \"0237de7d-a2e2-4924-bc12-962079df4f96\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" Apr 17 14:34:30.941716 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.941633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9381de67-e5f0-4a19-bd6a-1041cf6f0c8b-tmp\") pod \"klusterlet-addon-workmgr-7c4b667cb5-pg4vw\" (UID: \"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:30.941716 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.941680 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9381de67-e5f0-4a19-bd6a-1041cf6f0c8b-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c4b667cb5-pg4vw\" (UID: \"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:30.941939 ip-10-0-135-180 kubenswrapper[2577]: 
I0417 14:34:30.941768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4bnf\" (UniqueName: \"kubernetes.io/projected/9381de67-e5f0-4a19-bd6a-1041cf6f0c8b-kube-api-access-d4bnf\") pod \"klusterlet-addon-workmgr-7c4b667cb5-pg4vw\" (UID: \"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:30.942081 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.942056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9381de67-e5f0-4a19-bd6a-1041cf6f0c8b-tmp\") pod \"klusterlet-addon-workmgr-7c4b667cb5-pg4vw\" (UID: \"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:30.944057 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.944032 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9381de67-e5f0-4a19-bd6a-1041cf6f0c8b-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c4b667cb5-pg4vw\" (UID: \"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:30.950056 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.950033 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4bnf\" (UniqueName: \"kubernetes.io/projected/9381de67-e5f0-4a19-bd6a-1041cf6f0c8b-kube-api-access-d4bnf\") pod \"klusterlet-addon-workmgr-7c4b667cb5-pg4vw\" (UID: \"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:30.978059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.978027 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" Apr 17 14:34:30.992025 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:30.991841 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:31.142368 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:31.142334 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9"] Apr 17 14:34:31.145300 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:31.145264 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw"] Apr 17 14:34:31.146514 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:34:31.146490 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0237de7d_a2e2_4924_bc12_962079df4f96.slice/crio-fb890b98a62719c87bd1cef81d0f72d17c1d7fbf7204a9cceebc55ce32abf739 WatchSource:0}: Error finding container fb890b98a62719c87bd1cef81d0f72d17c1d7fbf7204a9cceebc55ce32abf739: Status 404 returned error can't find the container with id fb890b98a62719c87bd1cef81d0f72d17c1d7fbf7204a9cceebc55ce32abf739 Apr 17 14:34:31.148986 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:34:31.148950 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9381de67_e5f0_4a19_bd6a_1041cf6f0c8b.slice/crio-b453a684d0b8fb8b4757c93671e329164efcded0ff8ae3651f7d7dbc28f54493 WatchSource:0}: Error finding container b453a684d0b8fb8b4757c93671e329164efcded0ff8ae3651f7d7dbc28f54493: Status 404 returned error can't find the container with id b453a684d0b8fb8b4757c93671e329164efcded0ff8ae3651f7d7dbc28f54493 Apr 17 14:34:31.970511 ip-10-0-135-180 kubenswrapper[2577]: I0417 
14:34:31.970475 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" event={"ID":"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b","Type":"ContainerStarted","Data":"b453a684d0b8fb8b4757c93671e329164efcded0ff8ae3651f7d7dbc28f54493"} Apr 17 14:34:31.971794 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:31.971763 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" event={"ID":"0237de7d-a2e2-4924-bc12-962079df4f96","Type":"ContainerStarted","Data":"fb890b98a62719c87bd1cef81d0f72d17c1d7fbf7204a9cceebc55ce32abf739"} Apr 17 14:34:32.455416 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:32.455375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr" Apr 17 14:34:32.455988 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:32.455466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2" Apr 17 14:34:32.455988 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:32.455546 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:32.455988 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:32.455605 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:32.455988 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:32.455629 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls podName:fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:36.455605558 +0000 UTC m=+45.192833539 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls") pod "dns-default-t99cr" (UID: "fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0") : secret "dns-default-metrics-tls" not found Apr 17 14:34:32.455988 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:32.455659 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert podName:df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed nodeName:}" failed. No retries permitted until 2026-04-17 14:34:36.455642528 +0000 UTC m=+45.192870524 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert") pod "ingress-canary-r25f2" (UID: "df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed") : secret "canary-serving-cert" not found Apr 17 14:34:36.488613 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:36.488408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2" Apr 17 14:34:36.489175 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:36.488666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr" Apr 17 14:34:36.489175 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:36.488578 2577 
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:36.489175 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:36.488733 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert podName:df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed nodeName:}" failed. No retries permitted until 2026-04-17 14:34:44.488717725 +0000 UTC m=+53.225945704 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert") pod "ingress-canary-r25f2" (UID: "df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed") : secret "canary-serving-cert" not found Apr 17 14:34:36.489175 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:36.488747 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:36.489175 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:36.488786 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls podName:fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:44.488775764 +0000 UTC m=+53.226003742 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls") pod "dns-default-t99cr" (UID: "fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0") : secret "dns-default-metrics-tls" not found Apr 17 14:34:36.984813 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:36.984775 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" event={"ID":"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b","Type":"ContainerStarted","Data":"80b0d711a7dbcf76c3855027de7d2812a31a9a4d0f0574168216a71de376a568"} Apr 17 14:34:36.985093 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:36.984934 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:36.986214 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:36.986189 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" event={"ID":"0237de7d-a2e2-4924-bc12-962079df4f96","Type":"ContainerStarted","Data":"677a82b4c3016a1729d191a3a336cdb42ad4a924c534d583daf6c1ed06fe87bb"} Apr 17 14:34:36.987007 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:36.986989 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:34:36.999822 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:36.999765 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" podStartSLOduration=2.2421181199999998 podStartE2EDuration="6.999752123s" podCreationTimestamp="2026-04-17 14:34:30 +0000 UTC" firstStartedPulling="2026-04-17 14:34:31.150628436 +0000 UTC m=+39.887856416" lastFinishedPulling="2026-04-17 14:34:35.908262437 +0000 
UTC m=+44.645490419" observedRunningTime="2026-04-17 14:34:36.998270732 +0000 UTC m=+45.735498862" watchObservedRunningTime="2026-04-17 14:34:36.999752123 +0000 UTC m=+45.736980179"
Apr 17 14:34:37.023810 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:37.023752 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" podStartSLOduration=2.27736969 podStartE2EDuration="7.023733478s" podCreationTimestamp="2026-04-17 14:34:30 +0000 UTC" firstStartedPulling="2026-04-17 14:34:31.148632091 +0000 UTC m=+39.885860086" lastFinishedPulling="2026-04-17 14:34:35.894995895 +0000 UTC m=+44.632223874" observedRunningTime="2026-04-17 14:34:37.023195293 +0000 UTC m=+45.760423295" watchObservedRunningTime="2026-04-17 14:34:37.023733478 +0000 UTC m=+45.760961480"
Apr 17 14:34:44.547346 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:44.547300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:34:44.547913 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:44.547358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2"
Apr 17 14:34:44.547913 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:44.547469 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:34:44.547913 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:44.547484 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:34:44.547913 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:44.547523 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert podName:df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed nodeName:}" failed. No retries permitted until 2026-04-17 14:35:00.547509388 +0000 UTC m=+69.284737371 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert") pod "ingress-canary-r25f2" (UID: "df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed") : secret "canary-serving-cert" not found
Apr 17 14:34:44.547913 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:44.547557 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls podName:fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0 nodeName:}" failed. No retries permitted until 2026-04-17 14:35:00.547536618 +0000 UTC m=+69.284764597 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls") pod "dns-default-t99cr" (UID: "fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0") : secret "dns-default-metrics-tls" not found
Apr 17 14:34:52.951916 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:52.951875 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbxb9"
Apr 17 14:34:57.644972 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:57.644917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j64g\" (UniqueName: \"kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g\") pod \"network-check-target-v8n74\" (UID: \"5b856db3-62df-499f-acb9-ca566ed0be77\") " pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:57.645467 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:57.644988 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl"
Apr 17 14:34:57.647528 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:57.647506 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 14:34:57.647626 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:57.647583 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 14:34:57.655282 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:57.655261 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:34:57.655371 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:34:57.655335 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs podName:a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:01.655318078 +0000 UTC m=+130.392546056 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs") pod "network-metrics-daemon-4nvfl" (UID: "a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87") : secret "metrics-daemon-secret" not found
Apr 17 14:34:57.657019 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:57.657006 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 14:34:57.669861 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:57.669836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j64g\" (UniqueName: \"kubernetes.io/projected/5b856db3-62df-499f-acb9-ca566ed0be77-kube-api-access-9j64g\") pod \"network-check-target-v8n74\" (UID: \"5b856db3-62df-499f-acb9-ca566ed0be77\") " pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:57.715603 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:57.715575 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mnbtt\""
Apr 17 14:34:57.723559 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:57.723538 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:34:57.837608 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:57.837575 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-v8n74"]
Apr 17 14:34:57.840836 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:34:57.840802 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b856db3_62df_499f_acb9_ca566ed0be77.slice/crio-e446766329b7224344e7ee23b21f87769abcfbba0cd22b06b23c4758ada70023 WatchSource:0}: Error finding container e446766329b7224344e7ee23b21f87769abcfbba0cd22b06b23c4758ada70023: Status 404 returned error can't find the container with id e446766329b7224344e7ee23b21f87769abcfbba0cd22b06b23c4758ada70023
Apr 17 14:34:58.028994 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:58.028961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-v8n74" event={"ID":"5b856db3-62df-499f-acb9-ca566ed0be77","Type":"ContainerStarted","Data":"e446766329b7224344e7ee23b21f87769abcfbba0cd22b06b23c4758ada70023"}
Apr 17 14:34:58.249565 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:58.249526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:58.251943 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:58.251925 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 14:34:58.261991 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:58.261967 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8cd56fcc-a291-4dce-ab6e-46a29c54ed43-original-pull-secret\") pod \"global-pull-secret-syncer-r2ggk\" (UID: \"8cd56fcc-a291-4dce-ab6e-46a29c54ed43\") " pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:58.308136 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:58.308056 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r2ggk"
Apr 17 14:34:58.422223 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:58.422191 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r2ggk"]
Apr 17 14:34:58.425410 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:34:58.425382 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd56fcc_a291_4dce_ab6e_46a29c54ed43.slice/crio-e17584496d1fe6a4fd1f4e1f8b72c896024516ea647901af70d7efd5185748d7 WatchSource:0}: Error finding container e17584496d1fe6a4fd1f4e1f8b72c896024516ea647901af70d7efd5185748d7: Status 404 returned error can't find the container with id e17584496d1fe6a4fd1f4e1f8b72c896024516ea647901af70d7efd5185748d7
Apr 17 14:34:59.032722 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:34:59.032675 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r2ggk" event={"ID":"8cd56fcc-a291-4dce-ab6e-46a29c54ed43","Type":"ContainerStarted","Data":"e17584496d1fe6a4fd1f4e1f8b72c896024516ea647901af70d7efd5185748d7"}
Apr 17 14:35:00.566565 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:35:00.566523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:35:00.567012 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:35:00.566586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2"
Apr 17 14:35:00.567012 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:35:00.566716 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:35:00.567012 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:35:00.566723 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:35:00.567012 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:35:00.566817 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls podName:fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0 nodeName:}" failed. No retries permitted until 2026-04-17 14:35:32.566792789 +0000 UTC m=+101.304020781 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls") pod "dns-default-t99cr" (UID: "fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0") : secret "dns-default-metrics-tls" not found
Apr 17 14:35:00.567012 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:35:00.566893 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert podName:df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed nodeName:}" failed. No retries permitted until 2026-04-17 14:35:32.566866577 +0000 UTC m=+101.304094562 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert") pod "ingress-canary-r25f2" (UID: "df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed") : secret "canary-serving-cert" not found
Apr 17 14:35:03.043664 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:35:03.043620 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-v8n74" event={"ID":"5b856db3-62df-499f-acb9-ca566ed0be77","Type":"ContainerStarted","Data":"efc4e20ddc0058c1d138a3790ccb55d8b296e9804c36182cde902d268bdfe4bf"}
Apr 17 14:35:03.044140 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:35:03.043760 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:35:03.044825 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:35:03.044808 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r2ggk" event={"ID":"8cd56fcc-a291-4dce-ab6e-46a29c54ed43","Type":"ContainerStarted","Data":"678c32d4a93182a21465ff0493e34644edfdf3032f17f5467ccaaa3eadb2f074"}
Apr 17 14:35:03.058252 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:35:03.058210 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-v8n74" podStartSLOduration=67.443008217 podStartE2EDuration="1m12.058197191s" podCreationTimestamp="2026-04-17 14:33:51 +0000 UTC" firstStartedPulling="2026-04-17 14:34:57.842697122 +0000 UTC m=+66.579925101" lastFinishedPulling="2026-04-17 14:35:02.457886093 +0000 UTC m=+71.195114075" observedRunningTime="2026-04-17 14:35:03.056835345 +0000 UTC m=+71.794063346" watchObservedRunningTime="2026-04-17 14:35:03.058197191 +0000 UTC m=+71.795425192"
Apr 17 14:35:03.071193 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:35:03.071128 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-r2ggk" podStartSLOduration=65.035495591 podStartE2EDuration="1m9.071117861s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:34:58.427067461 +0000 UTC m=+67.164295440" lastFinishedPulling="2026-04-17 14:35:02.462689732 +0000 UTC m=+71.199917710" observedRunningTime="2026-04-17 14:35:03.070105452 +0000 UTC m=+71.807333454" watchObservedRunningTime="2026-04-17 14:35:03.071117861 +0000 UTC m=+71.808345855"
Apr 17 14:35:32.596548 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:35:32.596493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2"
Apr 17 14:35:32.596983 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:35:32.596594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr"
Apr 17 14:35:32.596983 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:35:32.596660 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:35:32.596983 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:35:32.596680 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:35:32.596983 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:35:32.596739 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert podName:df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed nodeName:}" failed. No retries permitted until 2026-04-17 14:36:36.596717367 +0000 UTC m=+165.333945347 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert") pod "ingress-canary-r25f2" (UID: "df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed") : secret "canary-serving-cert" not found
Apr 17 14:35:32.596983 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:35:32.596758 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls podName:fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:36.596749779 +0000 UTC m=+165.333977761 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls") pod "dns-default-t99cr" (UID: "fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0") : secret "dns-default-metrics-tls" not found
Apr 17 14:35:34.050045 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:35:34.050009 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-v8n74"
Apr 17 14:36:01.696347 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:01.696295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl"
Apr 17 14:36:01.696838 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:01.696445 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:36:01.696838 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:01.696554 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs podName:a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87 nodeName:}" failed. No retries permitted until 2026-04-17 14:38:03.696535005 +0000 UTC m=+252.433762995 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs") pod "network-metrics-daemon-4nvfl" (UID: "a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87") : secret "metrics-daemon-secret" not found
Apr 17 14:36:16.493874 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.493837 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"]
Apr 17 14:36:16.496542 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.496526 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.498583 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.498547 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 14:36:16.498709 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.498618 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 14:36:16.498709 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.498626 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-kqgdw\""
Apr 17 14:36:16.499485 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.499465 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:36:16.499549 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.499488 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 14:36:16.503023 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.503001 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6d757c5ffb-lfbbq"]
Apr 17 14:36:16.505972 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.505951 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"]
Apr 17 14:36:16.506070 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.506054 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.508078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.508059 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 14:36:16.508310 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.508289 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 14:36:16.508413 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.508310 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 14:36:16.508413 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.508315 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 14:36:16.508521 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.508475 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-f6626\""
Apr 17 14:36:16.508678 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.508664 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 14:36:16.508723 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.508686 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 14:36:16.519390 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.519366 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6d757c5ffb-lfbbq"]
Apr 17 14:36:16.596099 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.596060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv484\" (UniqueName: \"kubernetes.io/projected/3e50beca-69e6-4881-82d9-7f37412e8889-kube-api-access-lv484\") pod \"service-ca-operator-d6fc45fc5-mf4lh\" (UID: \"3e50beca-69e6-4881-82d9-7f37412e8889\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.596099 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.596101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e50beca-69e6-4881-82d9-7f37412e8889-serving-cert\") pod \"service-ca-operator-d6fc45fc5-mf4lh\" (UID: \"3e50beca-69e6-4881-82d9-7f37412e8889\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.596339 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.596120 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e50beca-69e6-4881-82d9-7f37412e8889-config\") pod \"service-ca-operator-d6fc45fc5-mf4lh\" (UID: \"3e50beca-69e6-4881-82d9-7f37412e8889\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.697109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.697068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv484\" (UniqueName: \"kubernetes.io/projected/3e50beca-69e6-4881-82d9-7f37412e8889-kube-api-access-lv484\") pod \"service-ca-operator-d6fc45fc5-mf4lh\" (UID: \"3e50beca-69e6-4881-82d9-7f37412e8889\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.697109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.697114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.697393 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.697141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e50beca-69e6-4881-82d9-7f37412e8889-serving-cert\") pod \"service-ca-operator-d6fc45fc5-mf4lh\" (UID: \"3e50beca-69e6-4881-82d9-7f37412e8889\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.697393 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.697188 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e50beca-69e6-4881-82d9-7f37412e8889-config\") pod \"service-ca-operator-d6fc45fc5-mf4lh\" (UID: \"3e50beca-69e6-4881-82d9-7f37412e8889\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.697393 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.697214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.697393 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.697247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fljd\" (UniqueName: \"kubernetes.io/projected/f6c83797-bcd2-4b74-a591-ecde3243182e-kube-api-access-7fljd\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.697393 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.697288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-default-certificate\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.697393 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.697314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-stats-auth\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.697817 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.697795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e50beca-69e6-4881-82d9-7f37412e8889-config\") pod \"service-ca-operator-d6fc45fc5-mf4lh\" (UID: \"3e50beca-69e6-4881-82d9-7f37412e8889\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.699466 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.699448 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e50beca-69e6-4881-82d9-7f37412e8889-serving-cert\") pod \"service-ca-operator-d6fc45fc5-mf4lh\" (UID: \"3e50beca-69e6-4881-82d9-7f37412e8889\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.705740 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.705714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv484\" (UniqueName: \"kubernetes.io/projected/3e50beca-69e6-4881-82d9-7f37412e8889-kube-api-access-lv484\") pod \"service-ca-operator-d6fc45fc5-mf4lh\" (UID: \"3e50beca-69e6-4881-82d9-7f37412e8889\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.797906 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.797878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.798062 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.797933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fljd\" (UniqueName: \"kubernetes.io/projected/f6c83797-bcd2-4b74-a591-ecde3243182e-kube-api-access-7fljd\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.798062 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:16.798034 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:36:16.798140 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.798056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-default-certificate\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.798140 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.798094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-stats-auth\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.798140 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:16.798111 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:17.29809255 +0000 UTC m=+146.035320529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : secret "router-metrics-certs-default" not found
Apr 17 14:36:16.798333 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.798239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.798378 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:16.798346 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:17.298336556 +0000 UTC m=+146.035564535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : configmap references non-existent config key: service-ca.crt
Apr 17 14:36:16.800637 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.800611 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-stats-auth\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.800736 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.800622 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-default-certificate\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.806386 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.806366 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"
Apr 17 14:36:16.806683 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.806667 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fljd\" (UniqueName: \"kubernetes.io/projected/f6c83797-bcd2-4b74-a591-ecde3243182e-kube-api-access-7fljd\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:16.923972 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:16.923940 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh"]
Apr 17 14:36:16.927166 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:36:16.927128 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e50beca_69e6_4881_82d9_7f37412e8889.slice/crio-dde5b7c5140466cdf0c30a69a728f2f9d897f88c2dd44e39df39b0f55a67a225 WatchSource:0}: Error finding container dde5b7c5140466cdf0c30a69a728f2f9d897f88c2dd44e39df39b0f55a67a225: Status 404 returned error can't find the container with id dde5b7c5140466cdf0c30a69a728f2f9d897f88c2dd44e39df39b0f55a67a225
Apr 17 14:36:17.186403 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:17.186306 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh" event={"ID":"3e50beca-69e6-4881-82d9-7f37412e8889","Type":"ContainerStarted","Data":"dde5b7c5140466cdf0c30a69a728f2f9d897f88c2dd44e39df39b0f55a67a225"}
Apr 17 14:36:17.303018 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:17.302979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:17.303018 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:17.303024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:17.303305 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:17.303130 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:36:17.303305 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:17.303195 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:18.303168889 +0000 UTC m=+147.040396882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : configmap references non-existent config key: service-ca.crt
Apr 17 14:36:17.303305 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:17.303223 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:18.303213523 +0000 UTC m=+147.040441503 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : secret "router-metrics-certs-default" not found
Apr 17 14:36:18.311564 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:18.311517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:18.311983 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:18.311648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq"
Apr 17 14:36:18.311983 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:18.311672 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:36:18.311983 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:18.311736 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:20.311718256 +0000 UTC m=+149.048946252 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : secret "router-metrics-certs-default" not found Apr 17 14:36:18.311983 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:18.311753 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:20.311744539 +0000 UTC m=+149.048972518 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : configmap references non-existent config key: service-ca.crt Apr 17 14:36:20.194246 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:20.194191 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh" event={"ID":"3e50beca-69e6-4881-82d9-7f37412e8889","Type":"ContainerStarted","Data":"333849502cb4a5b3df01e46d475af9a6741a999ec277095784a58eea5c68ce9e"} Apr 17 14:36:20.208400 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:20.208348 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh" podStartSLOduration=1.895667516 podStartE2EDuration="4.208334907s" podCreationTimestamp="2026-04-17 14:36:16 +0000 UTC" firstStartedPulling="2026-04-17 14:36:16.928933041 +0000 UTC m=+145.666161019" lastFinishedPulling="2026-04-17 14:36:19.241600427 +0000 UTC m=+147.978828410" observedRunningTime="2026-04-17 14:36:20.207348594 +0000 UTC m=+148.944576595" watchObservedRunningTime="2026-04-17 14:36:20.208334907 +0000 UTC 
m=+148.945562908" Apr 17 14:36:20.329748 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:20.329704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:20.329748 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:20.329757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:20.329954 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:20.329856 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 14:36:20.329954 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:20.329908 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:24.329894097 +0000 UTC m=+153.067122076 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : secret "router-metrics-certs-default" not found Apr 17 14:36:20.329954 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:20.329921 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:24.329915921 +0000 UTC m=+153.067143899 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : configmap references non-existent config key: service-ca.crt Apr 17 14:36:23.416549 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:23.416521 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v95rv_d058fa6d-b2cb-4092-b4f9-49dee965fff3/dns-node-resolver/0.log" Apr 17 14:36:24.360907 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:24.360868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:24.361058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:24.360961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: 
\"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:24.361058 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:24.361029 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 14:36:24.361172 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:24.361062 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:32.361048747 +0000 UTC m=+161.098276726 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : configmap references non-existent config key: service-ca.crt Apr 17 14:36:24.361172 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:24.361088 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:32.361073976 +0000 UTC m=+161.098301954 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : secret "router-metrics-certs-default" not found Apr 17 14:36:24.417732 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:24.417706 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tf94b_b72db081-842b-4ea0-a24a-f1f7916d7ee5/node-ca/0.log" Apr 17 14:36:31.694786 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:31.694693 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-t99cr" podUID="fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0" Apr 17 14:36:31.716108 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:31.716082 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-r25f2" podUID="df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed" Apr 17 14:36:32.220081 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:32.220051 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-t99cr" Apr 17 14:36:32.423597 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:32.423554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:32.423597 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:32.423603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:32.423803 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:32.423738 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle podName:f6c83797-bcd2-4b74-a591-ecde3243182e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:48.423717694 +0000 UTC m=+177.160945673 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle") pod "router-default-6d757c5ffb-lfbbq" (UID: "f6c83797-bcd2-4b74-a591-ecde3243182e") : configmap references non-existent config key: service-ca.crt Apr 17 14:36:32.426055 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:32.426029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6c83797-bcd2-4b74-a591-ecde3243182e-metrics-certs\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:32.818920 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:36:32.818874 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4nvfl" podUID="a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87" Apr 17 14:36:36.229368 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.229335 2577 generic.go:358] "Generic (PLEG): container finished" podID="0237de7d-a2e2-4924-bc12-962079df4f96" containerID="677a82b4c3016a1729d191a3a336cdb42ad4a924c534d583daf6c1ed06fe87bb" exitCode=255 Apr 17 14:36:36.229733 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.229406 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" event={"ID":"0237de7d-a2e2-4924-bc12-962079df4f96","Type":"ContainerDied","Data":"677a82b4c3016a1729d191a3a336cdb42ad4a924c534d583daf6c1ed06fe87bb"} Apr 17 14:36:36.230750 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.230728 2577 generic.go:358] "Generic (PLEG): container finished" podID="9381de67-e5f0-4a19-bd6a-1041cf6f0c8b" containerID="80b0d711a7dbcf76c3855027de7d2812a31a9a4d0f0574168216a71de376a568" 
exitCode=1 Apr 17 14:36:36.230850 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.230759 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" event={"ID":"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b","Type":"ContainerDied","Data":"80b0d711a7dbcf76c3855027de7d2812a31a9a4d0f0574168216a71de376a568"} Apr 17 14:36:36.231036 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.231024 2577 scope.go:117] "RemoveContainer" containerID="80b0d711a7dbcf76c3855027de7d2812a31a9a4d0f0574168216a71de376a568" Apr 17 14:36:36.235312 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.235296 2577 scope.go:117] "RemoveContainer" containerID="677a82b4c3016a1729d191a3a336cdb42ad4a924c534d583daf6c1ed06fe87bb" Apr 17 14:36:36.651629 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.651589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr" Apr 17 14:36:36.651798 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.651650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2" Apr 17 14:36:36.653987 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.653960 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0-metrics-tls\") pod \"dns-default-t99cr\" (UID: \"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0\") " pod="openshift-dns/dns-default-t99cr" Apr 17 14:36:36.654096 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.653971 
2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed-cert\") pod \"ingress-canary-r25f2\" (UID: \"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed\") " pod="openshift-ingress-canary/ingress-canary-r25f2" Apr 17 14:36:36.722870 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.722832 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fr4dp\"" Apr 17 14:36:36.730865 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.730842 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t99cr" Apr 17 14:36:36.841557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.841385 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t99cr"] Apr 17 14:36:36.843913 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:36:36.843887 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfce4b3ea_fa1f_49ea_9006_8df35ffaaaf0.slice/crio-7eaa018cedf94d6f98af0b5be5e02d923ac8172e65136359c575c1b55aa62715 WatchSource:0}: Error finding container 7eaa018cedf94d6f98af0b5be5e02d923ac8172e65136359c575c1b55aa62715: Status 404 returned error can't find the container with id 7eaa018cedf94d6f98af0b5be5e02d923ac8172e65136359c575c1b55aa62715 Apr 17 14:36:36.985058 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:36.984965 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:36:37.236127 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:37.236043 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t99cr" event={"ID":"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0","Type":"ContainerStarted","Data":"7eaa018cedf94d6f98af0b5be5e02d923ac8172e65136359c575c1b55aa62715"} 
Apr 17 14:36:37.238012 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:37.237989 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" event={"ID":"9381de67-e5f0-4a19-bd6a-1041cf6f0c8b","Type":"ContainerStarted","Data":"3eb64ce136a14453d4ecc7de060566f66466b1ffba14a72a88f9f241f28e4edc"} Apr 17 14:36:37.238281 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:37.238261 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:36:37.239103 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:37.239085 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c4b667cb5-pg4vw" Apr 17 14:36:37.239897 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:37.239875 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb4d6b5ff-mb4f9" event={"ID":"0237de7d-a2e2-4924-bc12-962079df4f96","Type":"ContainerStarted","Data":"d0437c8fef27d58eaa95276f0ad0ecdc712ecbd46da9047b99107f951f595949"} Apr 17 14:36:38.244363 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:38.244326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t99cr" event={"ID":"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0","Type":"ContainerStarted","Data":"26b8ac305502c47c9f7e57f75b47fa370b67decf53bf8ebcab2acdf12378bc14"} Apr 17 14:36:38.244774 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:38.244373 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t99cr" event={"ID":"fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0","Type":"ContainerStarted","Data":"6d85efe73c2063a825131aa57cabd38181ffe4baa3ad4cc37a46cf305ac2a564"} Apr 17 14:36:38.260241 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:38.260199 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t99cr" podStartSLOduration=129.101333857 podStartE2EDuration="2m10.260183534s" podCreationTimestamp="2026-04-17 14:34:28 +0000 UTC" firstStartedPulling="2026-04-17 14:36:36.845734936 +0000 UTC m=+165.582962915" lastFinishedPulling="2026-04-17 14:36:38.004584609 +0000 UTC m=+166.741812592" observedRunningTime="2026-04-17 14:36:38.259422139 +0000 UTC m=+166.996650146" watchObservedRunningTime="2026-04-17 14:36:38.260183534 +0000 UTC m=+166.997411536" Apr 17 14:36:39.247451 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:39.247422 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-t99cr" Apr 17 14:36:41.870759 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.870673 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5bjx6"] Apr 17 14:36:41.873776 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.873758 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:41.875938 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.875917 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 14:36:41.876046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.875965 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 14:36:41.876815 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.876798 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 14:36:41.876887 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.876853 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qsgzg\"" Apr 17 14:36:41.876887 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.876863 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 14:36:41.884248 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.884205 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5bjx6"] Apr 17 14:36:41.937746 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.937717 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7dff97d6d-4f999"] Apr 17 14:36:41.940656 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.940639 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:41.944400 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.944365 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 14:36:41.944617 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.944602 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 14:36:41.945176 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.945132 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 14:36:41.945267 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.945197 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qzqq8\"" Apr 17 14:36:41.955006 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.954987 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 14:36:41.960934 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.960914 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7dff97d6d-4f999"] Apr 17 14:36:41.991601 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/72340448-e6b1-4975-8a57-ec81d546fee4-data-volume\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:41.991692 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991606 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"crio-socket\" (UniqueName: \"kubernetes.io/host-path/72340448-e6b1-4975-8a57-ec81d546fee4-crio-socket\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:41.991692 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991627 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-bound-sa-token\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:41.991692 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-registry-tls\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:41.991792 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/72340448-e6b1-4975-8a57-ec81d546fee4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:41.991792 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bznkw\" (UniqueName: \"kubernetes.io/projected/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-kube-api-access-bznkw\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " 
pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:41.991792 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/72340448-e6b1-4975-8a57-ec81d546fee4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:41.991792 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzrx5\" (UniqueName: \"kubernetes.io/projected/72340448-e6b1-4975-8a57-ec81d546fee4-kube-api-access-xzrx5\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:41.991931 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991889 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-registry-certificates\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:41.991931 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-installation-pull-secrets\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:41.991986 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991956 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-ca-trust-extracted\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:41.992031 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.991988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-image-registry-private-configuration\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:41.992031 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:41.992006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-trusted-ca\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.092903 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.092872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-image-registry-private-configuration\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.092903 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.092903 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-trusted-ca\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.093132 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.092927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/72340448-e6b1-4975-8a57-ec81d546fee4-data-volume\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.093132 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.092951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/72340448-e6b1-4975-8a57-ec81d546fee4-crio-socket\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.093132 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.092980 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-bound-sa-token\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.093132 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-registry-tls\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.093132 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093034 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/72340448-e6b1-4975-8a57-ec81d546fee4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.093132 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/72340448-e6b1-4975-8a57-ec81d546fee4-crio-socket\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.093132 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093082 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bznkw\" (UniqueName: \"kubernetes.io/projected/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-kube-api-access-bznkw\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.093132 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/72340448-e6b1-4975-8a57-ec81d546fee4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.093557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzrx5\" (UniqueName: \"kubernetes.io/projected/72340448-e6b1-4975-8a57-ec81d546fee4-kube-api-access-xzrx5\") pod \"insights-runtime-extractor-5bjx6\" (UID: 
\"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.093557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-registry-certificates\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.093557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-installation-pull-secrets\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.093557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-ca-trust-extracted\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.093557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093355 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/72340448-e6b1-4975-8a57-ec81d546fee4-data-volume\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.093821 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093640 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/72340448-e6b1-4975-8a57-ec81d546fee4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.093821 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.093773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-ca-trust-extracted\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.094270 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.094246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-registry-certificates\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.094649 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.094624 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-trusted-ca\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.095602 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.095580 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-image-registry-private-configuration\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " 
pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.095697 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.095668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/72340448-e6b1-4975-8a57-ec81d546fee4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.095754 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.095739 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-registry-tls\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.095859 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.095840 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-installation-pull-secrets\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.103127 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.103092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bznkw\" (UniqueName: \"kubernetes.io/projected/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-kube-api-access-bznkw\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.103260 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.103235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/6a63267e-0738-4e6d-bbe4-4cdb4beb3002-bound-sa-token\") pod \"image-registry-7dff97d6d-4f999\" (UID: \"6a63267e-0738-4e6d-bbe4-4cdb4beb3002\") " pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.103465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.103446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzrx5\" (UniqueName: \"kubernetes.io/projected/72340448-e6b1-4975-8a57-ec81d546fee4-kube-api-access-xzrx5\") pod \"insights-runtime-extractor-5bjx6\" (UID: \"72340448-e6b1-4975-8a57-ec81d546fee4\") " pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.181914 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.181845 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5bjx6" Apr 17 14:36:42.248951 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.248923 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:42.296043 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.296011 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5bjx6"] Apr 17 14:36:42.299794 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:36:42.299745 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72340448_e6b1_4975_8a57_ec81d546fee4.slice/crio-26cba0d2874e3685243d5ad07f15f75b14550b91ae1ac13db50b649e7852b4d5 WatchSource:0}: Error finding container 26cba0d2874e3685243d5ad07f15f75b14550b91ae1ac13db50b649e7852b4d5: Status 404 returned error can't find the container with id 26cba0d2874e3685243d5ad07f15f75b14550b91ae1ac13db50b649e7852b4d5 Apr 17 14:36:42.374282 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.374246 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7dff97d6d-4f999"] Apr 17 14:36:42.377219 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:36:42.377184 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a63267e_0738_4e6d_bbe4_4cdb4beb3002.slice/crio-1fedb79d1d4fdfce07b396540ebe42c04aceb15f55f6cd67efb8ad0198428308 WatchSource:0}: Error finding container 1fedb79d1d4fdfce07b396540ebe42c04aceb15f55f6cd67efb8ad0198428308: Status 404 returned error can't find the container with id 1fedb79d1d4fdfce07b396540ebe42c04aceb15f55f6cd67efb8ad0198428308 Apr 17 14:36:42.796041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.796001 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r25f2" Apr 17 14:36:42.798454 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.798433 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dtcwn\"" Apr 17 14:36:42.806467 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.806443 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r25f2" Apr 17 14:36:42.925694 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:42.925666 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r25f2"] Apr 17 14:36:42.928165 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:36:42.928120 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf3539d9_3c50_4eb7_8ff9_72a06eb7b7ed.slice/crio-6507e9faa814122ed47492e06d34a1808121238f6f148ae277bc675172f8a822 WatchSource:0}: Error finding container 6507e9faa814122ed47492e06d34a1808121238f6f148ae277bc675172f8a822: Status 404 returned error can't find the container with id 6507e9faa814122ed47492e06d34a1808121238f6f148ae277bc675172f8a822 Apr 17 14:36:43.261922 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:43.261889 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7dff97d6d-4f999" event={"ID":"6a63267e-0738-4e6d-bbe4-4cdb4beb3002","Type":"ContainerStarted","Data":"56090704b26afba7ad931a42311ba2aafdcc8412ccebe66363b6da7f6a848c3a"} Apr 17 14:36:43.261922 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:43.261923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7dff97d6d-4f999" event={"ID":"6a63267e-0738-4e6d-bbe4-4cdb4beb3002","Type":"ContainerStarted","Data":"1fedb79d1d4fdfce07b396540ebe42c04aceb15f55f6cd67efb8ad0198428308"} Apr 17 14:36:43.262185 ip-10-0-135-180 
kubenswrapper[2577]: I0417 14:36:43.262002 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7dff97d6d-4f999" Apr 17 14:36:43.262811 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:43.262792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r25f2" event={"ID":"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed","Type":"ContainerStarted","Data":"6507e9faa814122ed47492e06d34a1808121238f6f148ae277bc675172f8a822"} Apr 17 14:36:43.264233 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:43.264211 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5bjx6" event={"ID":"72340448-e6b1-4975-8a57-ec81d546fee4","Type":"ContainerStarted","Data":"4733256f365b3d006af1635b9f8b79d184b1a23c805f70576a270aee177a340b"} Apr 17 14:36:43.264316 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:43.264238 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5bjx6" event={"ID":"72340448-e6b1-4975-8a57-ec81d546fee4","Type":"ContainerStarted","Data":"af7c274b20a1b5ae9cf3602dc8990940309aa75682dff7b9f964069e67ea5701"} Apr 17 14:36:43.264316 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:43.264252 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5bjx6" event={"ID":"72340448-e6b1-4975-8a57-ec81d546fee4","Type":"ContainerStarted","Data":"26cba0d2874e3685243d5ad07f15f75b14550b91ae1ac13db50b649e7852b4d5"} Apr 17 14:36:43.279843 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:43.279802 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7dff97d6d-4f999" podStartSLOduration=2.279791394 podStartE2EDuration="2.279791394s" podCreationTimestamp="2026-04-17 14:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-04-17 14:36:43.278571159 +0000 UTC m=+172.015799161" watchObservedRunningTime="2026-04-17 14:36:43.279791394 +0000 UTC m=+172.017019386" Apr 17 14:36:45.270024 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:45.269989 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r25f2" event={"ID":"df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed","Type":"ContainerStarted","Data":"065fd1fa0713b0ff77293d2477cba2d15dd634a79acb902c0e49aee871440437"} Apr 17 14:36:45.271716 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:45.271696 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5bjx6" event={"ID":"72340448-e6b1-4975-8a57-ec81d546fee4","Type":"ContainerStarted","Data":"48d04015a510518d65f52729a68fe16f91cf1a006ef84a31441a08e3c19a7e43"} Apr 17 14:36:45.285361 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:45.285316 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r25f2" podStartSLOduration=135.273356883 podStartE2EDuration="2m17.285303486s" podCreationTimestamp="2026-04-17 14:34:28 +0000 UTC" firstStartedPulling="2026-04-17 14:36:42.930220567 +0000 UTC m=+171.667448546" lastFinishedPulling="2026-04-17 14:36:44.94216717 +0000 UTC m=+173.679395149" observedRunningTime="2026-04-17 14:36:45.284479881 +0000 UTC m=+174.021707882" watchObservedRunningTime="2026-04-17 14:36:45.285303486 +0000 UTC m=+174.022531486" Apr 17 14:36:45.300126 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:45.300070 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5bjx6" podStartSLOduration=1.716309147 podStartE2EDuration="4.300054882s" podCreationTimestamp="2026-04-17 14:36:41 +0000 UTC" firstStartedPulling="2026-04-17 14:36:42.35979981 +0000 UTC m=+171.097027806" lastFinishedPulling="2026-04-17 14:36:44.94354555 +0000 UTC 
m=+173.680773541" observedRunningTime="2026-04-17 14:36:45.29990114 +0000 UTC m=+174.037129141" watchObservedRunningTime="2026-04-17 14:36:45.300054882 +0000 UTC m=+174.037282884" Apr 17 14:36:45.795657 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:45.795621 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:36:48.445178 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:48.445116 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:48.445724 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:48.445701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c83797-bcd2-4b74-a591-ecde3243182e-service-ca-bundle\") pod \"router-default-6d757c5ffb-lfbbq\" (UID: \"f6c83797-bcd2-4b74-a591-ecde3243182e\") " pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:48.614806 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:48.614763 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:48.727779 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:48.727757 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6d757c5ffb-lfbbq"] Apr 17 14:36:48.730013 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:36:48.729980 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c83797_bcd2_4b74_a591_ecde3243182e.slice/crio-e7c34066748ac58e7f94fbc24b2e88b10ebe91b5919a30c054c7ea9876aea057 WatchSource:0}: Error finding container e7c34066748ac58e7f94fbc24b2e88b10ebe91b5919a30c054c7ea9876aea057: Status 404 returned error can't find the container with id e7c34066748ac58e7f94fbc24b2e88b10ebe91b5919a30c054c7ea9876aea057 Apr 17 14:36:49.251872 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:49.251840 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t99cr" Apr 17 14:36:49.285271 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:49.285234 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" event={"ID":"f6c83797-bcd2-4b74-a591-ecde3243182e","Type":"ContainerStarted","Data":"4538b67743f5f1d942c234515ca377a31f698aeefc35fd40c57e71bd8f5830d0"} Apr 17 14:36:49.285416 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:49.285277 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" event={"ID":"f6c83797-bcd2-4b74-a591-ecde3243182e","Type":"ContainerStarted","Data":"e7c34066748ac58e7f94fbc24b2e88b10ebe91b5919a30c054c7ea9876aea057"} Apr 17 14:36:49.302171 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:49.302105 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" podStartSLOduration=33.302089299 podStartE2EDuration="33.302089299s" 
podCreationTimestamp="2026-04-17 14:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:36:49.301348637 +0000 UTC m=+178.038576639" watchObservedRunningTime="2026-04-17 14:36:49.302089299 +0000 UTC m=+178.039317300" Apr 17 14:36:49.615570 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:49.615532 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:49.618194 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:49.618165 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:50.287806 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:50.287776 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:36:50.289459 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:36:50.289440 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6d757c5ffb-lfbbq" Apr 17 14:37:00.421830 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.421799 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9nqpt"] Apr 17 14:37:00.424705 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.424687 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.426988 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.426962 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 14:37:00.426988 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.426981 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 14:37:00.427204 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.427003 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 14:37:00.427342 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.427325 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 14:37:00.428016 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.427997 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 14:37:00.428074 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.428002 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 14:37:00.428074 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.428032 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zfm9f\"" Apr 17 14:37:00.531402 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.531364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/044d54bf-e77d-422e-b7ca-a2f4a003b98a-metrics-client-ca\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " 
pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.531580 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.531411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/044d54bf-e77d-422e-b7ca-a2f4a003b98a-sys\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.531580 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.531446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-accelerators-collector-config\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.531580 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.531476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-wtmp\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.531580 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.531552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.531720 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.531600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-tls\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.531720 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.531623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-textfile\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.531720 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.531639 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnh2w\" (UniqueName: \"kubernetes.io/projected/044d54bf-e77d-422e-b7ca-a2f4a003b98a-kube-api-access-hnh2w\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.531720 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.531657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/044d54bf-e77d-422e-b7ca-a2f4a003b98a-root\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.632882 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.632832 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.632882 ip-10-0-135-180 
kubenswrapper[2577]: I0417 14:37:00.632894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-tls\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.633109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.632924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-textfile\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.633109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.632941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnh2w\" (UniqueName: \"kubernetes.io/projected/044d54bf-e77d-422e-b7ca-a2f4a003b98a-kube-api-access-hnh2w\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.633109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.632958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/044d54bf-e77d-422e-b7ca-a2f4a003b98a-root\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 14:37:00.633109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.632982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/044d54bf-e77d-422e-b7ca-a2f4a003b98a-metrics-client-ca\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt" Apr 17 
14:37:00.633109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.633015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/044d54bf-e77d-422e-b7ca-a2f4a003b98a-sys\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:00.633109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.633043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-accelerators-collector-config\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:00.633109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.633076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-wtmp\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:00.633109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.633085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/044d54bf-e77d-422e-b7ca-a2f4a003b98a-root\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:00.633109 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.633103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/044d54bf-e77d-422e-b7ca-a2f4a003b98a-sys\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:00.633525 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:37:00.633093 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 14:37:00.633525 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:37:00.633214 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-tls podName:044d54bf-e77d-422e-b7ca-a2f4a003b98a nodeName:}" failed. No retries permitted until 2026-04-17 14:37:01.133192366 +0000 UTC m=+189.870420360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-tls") pod "node-exporter-9nqpt" (UID: "044d54bf-e77d-422e-b7ca-a2f4a003b98a") : secret "node-exporter-tls" not found
Apr 17 14:37:00.633525 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.633245 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-wtmp\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:00.633525 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.633332 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-textfile\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:00.633668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.633593 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-accelerators-collector-config\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:00.633668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.633659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/044d54bf-e77d-422e-b7ca-a2f4a003b98a-metrics-client-ca\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:00.635311 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.635289 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:00.642673 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:00.642645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnh2w\" (UniqueName: \"kubernetes.io/projected/044d54bf-e77d-422e-b7ca-a2f4a003b98a-kube-api-access-hnh2w\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:01.137857 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:01.137821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-tls\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:01.140026 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:01.139994 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/044d54bf-e77d-422e-b7ca-a2f4a003b98a-node-exporter-tls\") pod \"node-exporter-9nqpt\" (UID: \"044d54bf-e77d-422e-b7ca-a2f4a003b98a\") " pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:01.334076 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:01.334046 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9nqpt"
Apr 17 14:37:01.342560 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:37:01.342528 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod044d54bf_e77d_422e_b7ca_a2f4a003b98a.slice/crio-ce6a58425b66b84a763aa46a3c83b51d9b6d9823f3d8cecdcd2bad083ec9aefe WatchSource:0}: Error finding container ce6a58425b66b84a763aa46a3c83b51d9b6d9823f3d8cecdcd2bad083ec9aefe: Status 404 returned error can't find the container with id ce6a58425b66b84a763aa46a3c83b51d9b6d9823f3d8cecdcd2bad083ec9aefe
Apr 17 14:37:02.316535 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:02.316486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9nqpt" event={"ID":"044d54bf-e77d-422e-b7ca-a2f4a003b98a","Type":"ContainerStarted","Data":"fd4ac341a1b10e6b95971d13126685ca54f8477e695ff3070a242db368eb54bb"}
Apr 17 14:37:02.316535 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:02.316537 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9nqpt" event={"ID":"044d54bf-e77d-422e-b7ca-a2f4a003b98a","Type":"ContainerStarted","Data":"ce6a58425b66b84a763aa46a3c83b51d9b6d9823f3d8cecdcd2bad083ec9aefe"}
Apr 17 14:37:03.319756 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:03.319727 2577 generic.go:358] "Generic (PLEG): container finished" podID="044d54bf-e77d-422e-b7ca-a2f4a003b98a" containerID="fd4ac341a1b10e6b95971d13126685ca54f8477e695ff3070a242db368eb54bb" exitCode=0
Apr 17 14:37:03.320111 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:03.319799 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9nqpt" event={"ID":"044d54bf-e77d-422e-b7ca-a2f4a003b98a","Type":"ContainerDied","Data":"fd4ac341a1b10e6b95971d13126685ca54f8477e695ff3070a242db368eb54bb"}
Apr 17 14:37:04.270580 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.270551 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7dff97d6d-4f999"
Apr 17 14:37:04.324863 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.324816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9nqpt" event={"ID":"044d54bf-e77d-422e-b7ca-a2f4a003b98a","Type":"ContainerStarted","Data":"f55e22f27e68a52e104da8528da0f6203ec8844a7bcb9d0e8bc9ccb4514d54fd"}
Apr 17 14:37:04.324863 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.324852 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9nqpt" event={"ID":"044d54bf-e77d-422e-b7ca-a2f4a003b98a","Type":"ContainerStarted","Data":"e23784d29bacbfdd301075fab8ed203b6e53f280dba8856759a42a39f74b564c"}
Apr 17 14:37:04.381445 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.381377 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9nqpt" podStartSLOduration=3.572476555 podStartE2EDuration="4.381361934s" podCreationTimestamp="2026-04-17 14:37:00 +0000 UTC" firstStartedPulling="2026-04-17 14:37:01.344421391 +0000 UTC m=+190.081649377" lastFinishedPulling="2026-04-17 14:37:02.153306777 +0000 UTC m=+190.890534756" observedRunningTime="2026-04-17 14:37:04.380760914 +0000 UTC m=+193.117988916" watchObservedRunningTime="2026-04-17 14:37:04.381361934 +0000 UTC m=+193.118589934"
Apr 17 14:37:04.881738 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.881709 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-b7fd7b587-d4d47"]
Apr 17 14:37:04.883681 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.883666 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:04.885947 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.885924 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 14:37:04.886067 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.886025 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 14:37:04.887112 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.887095 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 14:37:04.887320 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.887301 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-b9es1tlne2k5m\""
Apr 17 14:37:04.887392 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.887324 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-5frt9\""
Apr 17 14:37:04.887392 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.887376 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 14:37:04.894643 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.894623 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-b7fd7b587-d4d47"]
Apr 17 14:37:04.972473 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.972433 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-secret-metrics-server-tls\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:04.972648 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.972487 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-secret-metrics-server-client-certs\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:04.972648 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.972515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp56d\" (UniqueName: \"kubernetes.io/projected/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-kube-api-access-rp56d\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:04.972648 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.972553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:04.972648 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.972582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-metrics-server-audit-profiles\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:04.972812 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.972642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-client-ca-bundle\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:04.972812 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:04.972673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-audit-log\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.073937 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.073902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-secret-metrics-server-tls\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.073937 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.073941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-secret-metrics-server-client-certs\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.074207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.073960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp56d\" (UniqueName: \"kubernetes.io/projected/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-kube-api-access-rp56d\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.074207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.073987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.074207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.074005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-metrics-server-audit-profiles\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.074207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.074037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-client-ca-bundle\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.074207 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.074055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-audit-log\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.074565 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.074536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-audit-log\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.074917 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.074890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.075020 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.075009 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-metrics-server-audit-profiles\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.076478 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.076446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-secret-metrics-server-tls\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.076624 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.076604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-secret-metrics-server-client-certs\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.076863 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.076843 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-client-ca-bundle\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.081218 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.081197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp56d\" (UniqueName: \"kubernetes.io/projected/24fe5dfb-a1f2-45df-8eb5-c03383ef1f51-kube-api-access-rp56d\") pod \"metrics-server-b7fd7b587-d4d47\" (UID: \"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51\") " pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.168801 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.168710 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm"]
Apr 17 14:37:05.171529 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.171506 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm"
Apr 17 14:37:05.173775 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.173752 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 17 14:37:05.173871 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.173762 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-zbnpr\""
Apr 17 14:37:05.178255 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.178230 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm"]
Apr 17 14:37:05.192932 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.192908 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47"
Apr 17 14:37:05.275695 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.275651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7a7eeed-14fd-46f7-aa37-49304a1e871a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rf2lm\" (UID: \"c7a7eeed-14fd-46f7-aa37-49304a1e871a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm"
Apr 17 14:37:05.304929 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.304900 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-b7fd7b587-d4d47"]
Apr 17 14:37:05.307816 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:37:05.307786 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24fe5dfb_a1f2_45df_8eb5_c03383ef1f51.slice/crio-6776469b5011913b083b394c9087fbdf2fdb5d8ec937dca24e1f52ad734a00d2 WatchSource:0}: Error finding container 6776469b5011913b083b394c9087fbdf2fdb5d8ec937dca24e1f52ad734a00d2: Status 404 returned error can't find the container with id 6776469b5011913b083b394c9087fbdf2fdb5d8ec937dca24e1f52ad734a00d2
Apr 17 14:37:05.328285 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.328260 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47" event={"ID":"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51","Type":"ContainerStarted","Data":"6776469b5011913b083b394c9087fbdf2fdb5d8ec937dca24e1f52ad734a00d2"}
Apr 17 14:37:05.376724 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.376690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7a7eeed-14fd-46f7-aa37-49304a1e871a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rf2lm\" (UID: \"c7a7eeed-14fd-46f7-aa37-49304a1e871a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm"
Apr 17 14:37:05.376856 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:37:05.376824 2577 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 17 14:37:05.376900 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:37:05.376885 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7a7eeed-14fd-46f7-aa37-49304a1e871a-monitoring-plugin-cert podName:c7a7eeed-14fd-46f7-aa37-49304a1e871a nodeName:}" failed. No retries permitted until 2026-04-17 14:37:05.876869867 +0000 UTC m=+194.614097852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/c7a7eeed-14fd-46f7-aa37-49304a1e871a-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-rf2lm" (UID: "c7a7eeed-14fd-46f7-aa37-49304a1e871a") : secret "monitoring-plugin-cert" not found
Apr 17 14:37:05.880507 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.880468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7a7eeed-14fd-46f7-aa37-49304a1e871a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rf2lm\" (UID: \"c7a7eeed-14fd-46f7-aa37-49304a1e871a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm"
Apr 17 14:37:05.883204 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:05.883180 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7a7eeed-14fd-46f7-aa37-49304a1e871a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rf2lm\" (UID: \"c7a7eeed-14fd-46f7-aa37-49304a1e871a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm"
Apr 17 14:37:06.081079 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.081047 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm"
Apr 17 14:37:06.209525 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.209478 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm"]
Apr 17 14:37:06.214171 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:37:06.214125 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7a7eeed_14fd_46f7_aa37_49304a1e871a.slice/crio-412aba7b8d533ab0879f7403ab716a3e21d021df6c6e8e519b3952b896b6267d WatchSource:0}: Error finding container 412aba7b8d533ab0879f7403ab716a3e21d021df6c6e8e519b3952b896b6267d: Status 404 returned error can't find the container with id 412aba7b8d533ab0879f7403ab716a3e21d021df6c6e8e519b3952b896b6267d
Apr 17 14:37:06.332532 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.332489 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm" event={"ID":"c7a7eeed-14fd-46f7-aa37-49304a1e871a","Type":"ContainerStarted","Data":"412aba7b8d533ab0879f7403ab716a3e21d021df6c6e8e519b3952b896b6267d"}
Apr 17 14:37:06.580306 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.580274 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 14:37:06.583113 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.583092 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.585813 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.585489 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-dqltf\""
Apr 17 14:37:06.585813 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.585530 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 14:37:06.585813 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.585560 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 14:37:06.585813 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.585654 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 14:37:06.585813 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.585812 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-c1pivc3p2ncpj\""
Apr 17 14:37:06.586309 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.586252 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 14:37:06.586411 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.586381 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 14:37:06.586471 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.586420 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 14:37:06.586471 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.586449 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 14:37:06.586574 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.586561 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 14:37:06.586723 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.586703 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 14:37:06.586811 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.586753 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 14:37:06.588781 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.588762 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 14:37:06.591638 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.591619 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 14:37:06.598440 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.598419 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 14:37:06.688904 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.688882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.688985 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.688913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-web-config\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.688985 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.688932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ntng\" (UniqueName: \"kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-kube-api-access-5ntng\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.688985 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.688973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689044 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689143 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689143 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689119 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689231 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689187 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689231 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689304 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689304 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689268 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689368 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689368 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689439 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689370 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689439 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689439 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689428 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config-out\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.689547 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.689452 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.790701 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.790662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.790915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.790712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.790915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.790746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.790915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.790782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.790915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.790818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.790915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.790846 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:37:06.790915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.790880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.790915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.790908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791293 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.791013 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791293 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.791062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791293 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.791095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791293 ip-10-0-135-180 kubenswrapper[2577]: I0417 
14:37:06.791118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791293 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.791146 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config-out\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791293 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.791184 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791293 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.791217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791293 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.791245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-web-config\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791293 ip-10-0-135-180 kubenswrapper[2577]: I0417 
14:37:06.791270 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ntng\" (UniqueName: \"kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-kube-api-access-5ntng\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791763 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.791306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.791763 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.791648 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.792478 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.792449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.795020 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.794898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 17 14:37:06.795693 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.795654 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.795939 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.795923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.796053 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.795946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.796138 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.795986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.796261 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.796208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.796420 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.796359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config-out\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.796557 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.796536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.796786 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.796768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.796870 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.796799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.796870 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.796823 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.796975 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.796918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.797227 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.797206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-web-config\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.797309 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.797242 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.797936 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.797918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.806562 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.806536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ntng\" (UniqueName: \"kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-kube-api-access-5ntng\") pod \"prometheus-k8s-0\" (UID: 
\"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:06.895705 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:06.895619 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:07.037804 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:07.037772 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 14:37:07.041229 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:37:07.041195 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81332ce6_34ef_4fc2_ade5_81df5c9ccb70.slice/crio-ed089ab7a66084c6ee1d3f53db736c2bda628e6ddf61257af7dba5046003f972 WatchSource:0}: Error finding container ed089ab7a66084c6ee1d3f53db736c2bda628e6ddf61257af7dba5046003f972: Status 404 returned error can't find the container with id ed089ab7a66084c6ee1d3f53db736c2bda628e6ddf61257af7dba5046003f972 Apr 17 14:37:07.337131 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:07.337088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerStarted","Data":"ed089ab7a66084c6ee1d3f53db736c2bda628e6ddf61257af7dba5046003f972"} Apr 17 14:37:07.338756 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:07.338727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47" event={"ID":"24fe5dfb-a1f2-45df-8eb5-c03383ef1f51","Type":"ContainerStarted","Data":"9c3f0e071287a3b284e80978fe8094f779360d2a2b3324a8657a8306bab2e702"} Apr 17 14:37:07.354671 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:07.354597 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47" podStartSLOduration=1.9849646220000001 podStartE2EDuration="3.354575149s" 
podCreationTimestamp="2026-04-17 14:37:04 +0000 UTC" firstStartedPulling="2026-04-17 14:37:05.309673535 +0000 UTC m=+194.046901514" lastFinishedPulling="2026-04-17 14:37:06.67928406 +0000 UTC m=+195.416512041" observedRunningTime="2026-04-17 14:37:07.354407968 +0000 UTC m=+196.091635970" watchObservedRunningTime="2026-04-17 14:37:07.354575149 +0000 UTC m=+196.091803150" Apr 17 14:37:08.343926 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:08.343885 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm" event={"ID":"c7a7eeed-14fd-46f7-aa37-49304a1e871a","Type":"ContainerStarted","Data":"aed67fe76a19f9fedb172f4dd1e60526838e52847314549dadf57bde1d007904"} Apr 17 14:37:08.344365 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:08.344085 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm" Apr 17 14:37:08.345318 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:08.345294 2577 generic.go:358] "Generic (PLEG): container finished" podID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerID="1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5" exitCode=0 Apr 17 14:37:08.345448 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:08.345380 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerDied","Data":"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5"} Apr 17 14:37:08.349315 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:08.349296 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm" Apr 17 14:37:08.357789 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:08.357750 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rf2lm" 
podStartSLOduration=1.978057684 podStartE2EDuration="3.357729372s" podCreationTimestamp="2026-04-17 14:37:05 +0000 UTC" firstStartedPulling="2026-04-17 14:37:06.216556607 +0000 UTC m=+194.953784588" lastFinishedPulling="2026-04-17 14:37:07.596228286 +0000 UTC m=+196.333456276" observedRunningTime="2026-04-17 14:37:08.356590148 +0000 UTC m=+197.093818153" watchObservedRunningTime="2026-04-17 14:37:08.357729372 +0000 UTC m=+197.094957373" Apr 17 14:37:11.355415 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:11.355373 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerStarted","Data":"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8"} Apr 17 14:37:11.355415 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:11.355410 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerStarted","Data":"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a"} Apr 17 14:37:13.364093 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:13.364055 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerStarted","Data":"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b"} Apr 17 14:37:13.364093 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:13.364092 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerStarted","Data":"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770"} Apr 17 14:37:13.364500 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:13.364105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerStarted","Data":"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f"} Apr 17 14:37:13.364500 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:13.364116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerStarted","Data":"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10"} Apr 17 14:37:13.390390 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:13.390331 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.695371435 podStartE2EDuration="7.390316873s" podCreationTimestamp="2026-04-17 14:37:06 +0000 UTC" firstStartedPulling="2026-04-17 14:37:07.043605022 +0000 UTC m=+195.780833002" lastFinishedPulling="2026-04-17 14:37:12.738550461 +0000 UTC m=+201.475778440" observedRunningTime="2026-04-17 14:37:13.388328 +0000 UTC m=+202.125556001" watchObservedRunningTime="2026-04-17 14:37:13.390316873 +0000 UTC m=+202.127544871" Apr 17 14:37:16.896532 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:16.896496 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:37:25.193889 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:25.193847 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47" Apr 17 14:37:25.193889 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:25.193889 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47" Apr 17 14:37:34.815614 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:34.815583 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-6d757c5ffb-lfbbq_f6c83797-bcd2-4b74-a591-ecde3243182e/router/0.log" Apr 17 14:37:34.833376 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:34.833349 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r25f2_df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed/serve-healthcheck-canary/0.log" Apr 17 14:37:40.440790 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:40.440750 2577 generic.go:358] "Generic (PLEG): container finished" podID="3e50beca-69e6-4881-82d9-7f37412e8889" containerID="333849502cb4a5b3df01e46d475af9a6741a999ec277095784a58eea5c68ce9e" exitCode=0 Apr 17 14:37:40.441187 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:40.440803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh" event={"ID":"3e50beca-69e6-4881-82d9-7f37412e8889","Type":"ContainerDied","Data":"333849502cb4a5b3df01e46d475af9a6741a999ec277095784a58eea5c68ce9e"} Apr 17 14:37:40.441187 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:40.441131 2577 scope.go:117] "RemoveContainer" containerID="333849502cb4a5b3df01e46d475af9a6741a999ec277095784a58eea5c68ce9e" Apr 17 14:37:41.444968 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:41.444933 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mf4lh" event={"ID":"3e50beca-69e6-4881-82d9-7f37412e8889","Type":"ContainerStarted","Data":"e7b198c5891ea1eeeeb3f82fb376edbf18dd3b3ad6c3469ebc8467877033d257"} Apr 17 14:37:45.198669 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:45.198638 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47" Apr 17 14:37:45.203405 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:37:45.203382 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/metrics-server-b7fd7b587-d4d47" Apr 17 14:38:03.788018 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:03.787985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:38:03.790307 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:03.790286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87-metrics-certs\") pod \"network-metrics-daemon-4nvfl\" (UID: \"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87\") " pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:38:03.798474 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:03.798438 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tg92k\"" Apr 17 14:38:03.806324 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:03.806302 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4nvfl" Apr 17 14:38:03.926720 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:03.926599 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4nvfl"] Apr 17 14:38:03.929690 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:38:03.929658 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8b3b9c0_3f2e_4dbe_b825_c2fe73140a87.slice/crio-8d70b9c0a702e91e9cf6d7bd0dd8f3d3fad7952fe52328045e4348ce83d05e09 WatchSource:0}: Error finding container 8d70b9c0a702e91e9cf6d7bd0dd8f3d3fad7952fe52328045e4348ce83d05e09: Status 404 returned error can't find the container with id 8d70b9c0a702e91e9cf6d7bd0dd8f3d3fad7952fe52328045e4348ce83d05e09 Apr 17 14:38:04.512190 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:04.512130 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4nvfl" event={"ID":"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87","Type":"ContainerStarted","Data":"8d70b9c0a702e91e9cf6d7bd0dd8f3d3fad7952fe52328045e4348ce83d05e09"} Apr 17 14:38:05.516238 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:05.516201 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4nvfl" event={"ID":"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87","Type":"ContainerStarted","Data":"2cf18ebb9e65a98b8168fbe7b80c44dc6b4f950ddbb6382d339e139c423fe2a4"} Apr 17 14:38:05.516607 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:05.516245 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4nvfl" event={"ID":"a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87","Type":"ContainerStarted","Data":"9421a37ca2264e89aa7b748a00858b2e1e10bcb069becf06ea31b65fc84e95b4"} Apr 17 14:38:05.532024 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:05.531974 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-4nvfl" podStartSLOduration=252.231144285 podStartE2EDuration="4m13.531960383s" podCreationTimestamp="2026-04-17 14:33:52 +0000 UTC" firstStartedPulling="2026-04-17 14:38:03.931432901 +0000 UTC m=+252.668660879" lastFinishedPulling="2026-04-17 14:38:05.232248999 +0000 UTC m=+253.969476977" observedRunningTime="2026-04-17 14:38:05.529825139 +0000 UTC m=+254.267053150" watchObservedRunningTime="2026-04-17 14:38:05.531960383 +0000 UTC m=+254.269188382" Apr 17 14:38:06.896217 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:06.896184 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:38:06.911472 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:06.911448 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:38:07.536437 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:07.536400 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:38:24.954827 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:24.954729 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 14:38:24.955431 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:24.955369 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="prometheus" containerID="cri-o://05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a" gracePeriod=600 Apr 17 14:38:24.955562 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:24.955391 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy-web" 
containerID="cri-o://629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f" gracePeriod=600 Apr 17 14:38:24.955562 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:24.955423 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="thanos-sidecar" containerID="cri-o://83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10" gracePeriod=600 Apr 17 14:38:24.955562 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:24.955460 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="config-reloader" containerID="cri-o://9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8" gracePeriod=600 Apr 17 14:38:24.955562 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:24.955458 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy" containerID="cri-o://9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770" gracePeriod=600 Apr 17 14:38:24.955562 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:24.955408 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy-thanos" containerID="cri-o://4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b" gracePeriod=600 Apr 17 14:38:25.193586 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.193555 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:38:25.265169 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265127 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-web-config\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265325 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265184 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-serving-certs-ca-bundle\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265325 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265203 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265325 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265228 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ntng\" (UniqueName: \"kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-kube-api-access-5ntng\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265325 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265245 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-tls\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: 
\"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265325 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265265 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-db\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265549 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265347 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-kubelet-serving-ca-bundle\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265549 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265385 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-kube-rbac-proxy\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265549 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265411 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-trusted-ca-bundle\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265549 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265449 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-grpc-tls\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " 
Apr 17 14:38:25.265549 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265494 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config-out\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265549 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265512 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-metrics-client-ca\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265549 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265528 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-rulefiles-0\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265893 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265555 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-thanos-prometheus-http-client-file\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265893 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265596 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265893 ip-10-0-135-180 kubenswrapper[2577]: I0417 
14:38:25.265635 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-tls-assets\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265893 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265661 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-metrics-client-certs\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265893 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265690 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\" (UID: \"81332ce6-34ef-4fc2-ade5-81df5c9ccb70\") " Apr 17 14:38:25.265893 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265865 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:38:25.266218 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.265972 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.266344 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.266304 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:38:25.266761 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.266501 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:38:25.267300 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.267020 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:38:25.268103 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.267501 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:38:25.268592 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.268547 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:25.269186 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.269138 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:38:25.269311 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.269178 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-kube-api-access-5ntng" (OuterVolumeSpecName: "kube-api-access-5ntng") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "kube-api-access-5ntng". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:38:25.269625 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.269589 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config" (OuterVolumeSpecName: "config") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:25.269890 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.269810 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:25.270006 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.269904 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config-out" (OuterVolumeSpecName: "config-out") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:38:25.270006 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.269959 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:25.270006 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.269972 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:25.270210 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.270192 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:38:25.270328 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.270267 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:25.271318 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.271297 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:25.271461 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.271436 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:25.278709 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.278688 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-web-config" (OuterVolumeSpecName: "web-config") pod "81332ce6-34ef-4fc2-ade5-81df5c9ccb70" (UID: "81332ce6-34ef-4fc2-ade5-81df5c9ccb70"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:25.366827 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366791 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-tls\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.366827 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366820 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-db\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.366827 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366830 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-kube-rbac-proxy\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath 
\"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366839 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-trusted-ca-bundle\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366849 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-grpc-tls\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366858 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config-out\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366866 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-metrics-client-ca\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366874 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366883 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-thanos-prometheus-http-client-file\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 
14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366893 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-config\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366901 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-tls-assets\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366910 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-metrics-client-certs\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366919 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366928 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-web-config\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366937 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: 
I0417 14:38:25.366945 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.367046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.366954 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5ntng\" (UniqueName: \"kubernetes.io/projected/81332ce6-34ef-4fc2-ade5-81df5c9ccb70-kube-api-access-5ntng\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\"" Apr 17 14:38:25.572554 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572461 2577 generic.go:358] "Generic (PLEG): container finished" podID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerID="4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b" exitCode=0 Apr 17 14:38:25.572554 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572489 2577 generic.go:358] "Generic (PLEG): container finished" podID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerID="9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770" exitCode=0 Apr 17 14:38:25.572554 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572497 2577 generic.go:358] "Generic (PLEG): container finished" podID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerID="629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f" exitCode=0 Apr 17 14:38:25.572554 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572503 2577 generic.go:358] "Generic (PLEG): container finished" podID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerID="83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10" exitCode=0 Apr 17 14:38:25.572554 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572510 2577 generic.go:358] "Generic (PLEG): container finished" podID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerID="9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8" 
exitCode=0 Apr 17 14:38:25.572554 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572515 2577 generic.go:358] "Generic (PLEG): container finished" podID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerID="05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a" exitCode=0 Apr 17 14:38:25.572554 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572545 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerDied","Data":"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b"} Apr 17 14:38:25.572999 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572584 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerDied","Data":"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770"} Apr 17 14:38:25.572999 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572588 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:38:25.572999 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572605 2577 scope.go:117] "RemoveContainer" containerID="4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b" Apr 17 14:38:25.572999 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572595 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerDied","Data":"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f"} Apr 17 14:38:25.572999 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerDied","Data":"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10"} Apr 17 14:38:25.572999 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerDied","Data":"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8"} Apr 17 14:38:25.572999 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerDied","Data":"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a"} Apr 17 14:38:25.572999 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.572770 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"81332ce6-34ef-4fc2-ade5-81df5c9ccb70","Type":"ContainerDied","Data":"ed089ab7a66084c6ee1d3f53db736c2bda628e6ddf61257af7dba5046003f972"} Apr 17 14:38:25.580061 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.580035 
2577 scope.go:117] "RemoveContainer" containerID="9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770" Apr 17 14:38:25.586775 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.586755 2577 scope.go:117] "RemoveContainer" containerID="629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f" Apr 17 14:38:25.592812 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.592794 2577 scope.go:117] "RemoveContainer" containerID="83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10" Apr 17 14:38:25.595237 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.595214 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 14:38:25.598411 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.598388 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 14:38:25.600016 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.599997 2577 scope.go:117] "RemoveContainer" containerID="9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8" Apr 17 14:38:25.608288 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.608268 2577 scope.go:117] "RemoveContainer" containerID="05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a" Apr 17 14:38:25.615165 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.615136 2577 scope.go:117] "RemoveContainer" containerID="1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5" Apr 17 14:38:25.621348 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.621328 2577 scope.go:117] "RemoveContainer" containerID="4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b" Apr 17 14:38:25.621739 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:38:25.621708 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": container with ID starting with 
4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b not found: ID does not exist" containerID="4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b" Apr 17 14:38:25.621847 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.621749 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b"} err="failed to get container status \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": rpc error: code = NotFound desc = could not find container \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": container with ID starting with 4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b not found: ID does not exist" Apr 17 14:38:25.621847 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.621795 2577 scope.go:117] "RemoveContainer" containerID="9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770" Apr 17 14:38:25.622121 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:38:25.622101 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": container with ID starting with 9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770 not found: ID does not exist" containerID="9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770" Apr 17 14:38:25.622214 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.622129 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770"} err="failed to get container status \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": rpc error: code = NotFound desc = could not find container \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": container with ID starting with 
9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770 not found: ID does not exist" Apr 17 14:38:25.622214 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.622169 2577 scope.go:117] "RemoveContainer" containerID="629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f" Apr 17 14:38:25.622453 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:38:25.622429 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": container with ID starting with 629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f not found: ID does not exist" containerID="629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f" Apr 17 14:38:25.622505 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.622459 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f"} err="failed to get container status \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": rpc error: code = NotFound desc = could not find container \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": container with ID starting with 629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f not found: ID does not exist" Apr 17 14:38:25.622505 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.622478 2577 scope.go:117] "RemoveContainer" containerID="83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10" Apr 17 14:38:25.622753 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:38:25.622728 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": container with ID starting with 83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10 not found: ID does not exist" 
containerID="83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10" Apr 17 14:38:25.622814 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.622764 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10"} err="failed to get container status \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": rpc error: code = NotFound desc = could not find container \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": container with ID starting with 83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10 not found: ID does not exist" Apr 17 14:38:25.622814 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.622785 2577 scope.go:117] "RemoveContainer" containerID="9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8" Apr 17 14:38:25.623048 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:38:25.623025 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": container with ID starting with 9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8 not found: ID does not exist" containerID="9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8" Apr 17 14:38:25.623099 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623055 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8"} err="failed to get container status \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": rpc error: code = NotFound desc = could not find container \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": container with ID starting with 9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8 not found: ID does not exist" Apr 17 
14:38:25.623099 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623076 2577 scope.go:117] "RemoveContainer" containerID="05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a" Apr 17 14:38:25.623348 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:38:25.623300 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": container with ID starting with 05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a not found: ID does not exist" containerID="05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a" Apr 17 14:38:25.623435 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623352 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a"} err="failed to get container status \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": rpc error: code = NotFound desc = could not find container \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": container with ID starting with 05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a not found: ID does not exist" Apr 17 14:38:25.623435 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623366 2577 scope.go:117] "RemoveContainer" containerID="1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5" Apr 17 14:38:25.623522 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623496 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 14:38:25.623639 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:38:25.623623 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": container with ID starting with 
1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5 not found: ID does not exist" containerID="1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5" Apr 17 14:38:25.623681 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623651 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5"} err="failed to get container status \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": rpc error: code = NotFound desc = could not find container \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": container with ID starting with 1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5 not found: ID does not exist" Apr 17 14:38:25.623681 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623670 2577 scope.go:117] "RemoveContainer" containerID="4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b" Apr 17 14:38:25.623771 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623760 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="prometheus" Apr 17 14:38:25.623826 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623774 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="prometheus" Apr 17 14:38:25.623826 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623788 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="config-reloader" Apr 17 14:38:25.623826 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623794 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="config-reloader" Apr 17 14:38:25.623826 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623804 2577 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="init-config-reloader" Apr 17 14:38:25.623826 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623809 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="init-config-reloader" Apr 17 14:38:25.623826 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623816 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy-web" Apr 17 14:38:25.623826 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623822 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy-web" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623843 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623851 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623864 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="thanos-sidecar" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623873 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="thanos-sidecar" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623882 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy-thanos" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 
14:38:25.623891 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy-thanos" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623914 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b"} err="failed to get container status \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": rpc error: code = NotFound desc = could not find container \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": container with ID starting with 4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b not found: ID does not exist" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623933 2577 scope.go:117] "RemoveContainer" containerID="9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623945 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="thanos-sidecar" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623954 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="prometheus" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623960 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy-web" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623966 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="config-reloader" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623973 2577 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy" Apr 17 14:38:25.624059 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.623981 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" containerName="kube-rbac-proxy-thanos" Apr 17 14:38:25.624543 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.624164 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770"} err="failed to get container status \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": rpc error: code = NotFound desc = could not find container \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": container with ID starting with 9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770 not found: ID does not exist" Apr 17 14:38:25.624543 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.624185 2577 scope.go:117] "RemoveContainer" containerID="629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f" Apr 17 14:38:25.624543 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.624401 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f"} err="failed to get container status \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": rpc error: code = NotFound desc = could not find container \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": container with ID starting with 629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f not found: ID does not exist" Apr 17 14:38:25.624543 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.624418 2577 scope.go:117] "RemoveContainer" containerID="83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10" Apr 17 14:38:25.624691 ip-10-0-135-180 kubenswrapper[2577]: I0417 
14:38:25.624620 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10"} err="failed to get container status \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": rpc error: code = NotFound desc = could not find container \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": container with ID starting with 83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10 not found: ID does not exist" Apr 17 14:38:25.624691 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.624636 2577 scope.go:117] "RemoveContainer" containerID="9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8" Apr 17 14:38:25.624853 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.624836 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8"} err="failed to get container status \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": rpc error: code = NotFound desc = could not find container \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": container with ID starting with 9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8 not found: ID does not exist" Apr 17 14:38:25.624901 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.624853 2577 scope.go:117] "RemoveContainer" containerID="05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a" Apr 17 14:38:25.625078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.625061 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a"} err="failed to get container status \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": rpc error: code = NotFound desc = could not find container 
\"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": container with ID starting with 05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a not found: ID does not exist" Apr 17 14:38:25.625078 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.625078 2577 scope.go:117] "RemoveContainer" containerID="1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5" Apr 17 14:38:25.625313 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.625295 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5"} err="failed to get container status \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": rpc error: code = NotFound desc = could not find container \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": container with ID starting with 1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5 not found: ID does not exist" Apr 17 14:38:25.625388 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.625324 2577 scope.go:117] "RemoveContainer" containerID="4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b" Apr 17 14:38:25.625585 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.625547 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b"} err="failed to get container status \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": rpc error: code = NotFound desc = could not find container \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": container with ID starting with 4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b not found: ID does not exist" Apr 17 14:38:25.625585 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.625582 2577 scope.go:117] "RemoveContainer" 
containerID="9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770" Apr 17 14:38:25.625789 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.625762 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770"} err="failed to get container status \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": rpc error: code = NotFound desc = could not find container \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": container with ID starting with 9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770 not found: ID does not exist" Apr 17 14:38:25.625855 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.625791 2577 scope.go:117] "RemoveContainer" containerID="629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f" Apr 17 14:38:25.626028 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.626011 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f"} err="failed to get container status \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": rpc error: code = NotFound desc = could not find container \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": container with ID starting with 629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f not found: ID does not exist" Apr 17 14:38:25.626076 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.626028 2577 scope.go:117] "RemoveContainer" containerID="83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10" Apr 17 14:38:25.626248 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.626229 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10"} err="failed to get container status 
\"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": rpc error: code = NotFound desc = could not find container \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": container with ID starting with 83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10 not found: ID does not exist" Apr 17 14:38:25.626248 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.626247 2577 scope.go:117] "RemoveContainer" containerID="9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8" Apr 17 14:38:25.626462 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.626444 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8"} err="failed to get container status \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": rpc error: code = NotFound desc = could not find container \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": container with ID starting with 9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8 not found: ID does not exist" Apr 17 14:38:25.626528 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.626464 2577 scope.go:117] "RemoveContainer" containerID="05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a" Apr 17 14:38:25.626680 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.626662 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a"} err="failed to get container status \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": rpc error: code = NotFound desc = could not find container \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": container with ID starting with 05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a not found: ID does not exist" Apr 17 14:38:25.626726 ip-10-0-135-180 
kubenswrapper[2577]: I0417 14:38:25.626680 2577 scope.go:117] "RemoveContainer" containerID="1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5" Apr 17 14:38:25.626880 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.626862 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5"} err="failed to get container status \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": rpc error: code = NotFound desc = could not find container \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": container with ID starting with 1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5 not found: ID does not exist" Apr 17 14:38:25.626938 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.626881 2577 scope.go:117] "RemoveContainer" containerID="4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b" Apr 17 14:38:25.627082 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.627065 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b"} err="failed to get container status \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": rpc error: code = NotFound desc = could not find container \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": container with ID starting with 4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b not found: ID does not exist" Apr 17 14:38:25.627142 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.627083 2577 scope.go:117] "RemoveContainer" containerID="9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770" Apr 17 14:38:25.627362 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.627343 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770"} err="failed to get container status \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": rpc error: code = NotFound desc = could not find container \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": container with ID starting with 9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770 not found: ID does not exist" Apr 17 14:38:25.627408 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.627363 2577 scope.go:117] "RemoveContainer" containerID="629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f" Apr 17 14:38:25.627563 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.627545 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f"} err="failed to get container status \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": rpc error: code = NotFound desc = could not find container \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": container with ID starting with 629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f not found: ID does not exist" Apr 17 14:38:25.627627 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.627564 2577 scope.go:117] "RemoveContainer" containerID="83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10" Apr 17 14:38:25.627730 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.627715 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10"} err="failed to get container status \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": rpc error: code = NotFound desc = could not find container \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": container with ID starting with 
83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10 not found: ID does not exist" Apr 17 14:38:25.627785 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.627731 2577 scope.go:117] "RemoveContainer" containerID="9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8" Apr 17 14:38:25.627939 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.627921 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8"} err="failed to get container status \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": rpc error: code = NotFound desc = could not find container \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": container with ID starting with 9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8 not found: ID does not exist" Apr 17 14:38:25.627980 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.627940 2577 scope.go:117] "RemoveContainer" containerID="05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a" Apr 17 14:38:25.628163 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.628134 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a"} err="failed to get container status \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": rpc error: code = NotFound desc = could not find container \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": container with ID starting with 05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a not found: ID does not exist" Apr 17 14:38:25.628209 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.628167 2577 scope.go:117] "RemoveContainer" containerID="1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5" Apr 17 14:38:25.628361 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.628345 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5"} err="failed to get container status \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": rpc error: code = NotFound desc = could not find container \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": container with ID starting with 1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5 not found: ID does not exist" Apr 17 14:38:25.628404 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.628362 2577 scope.go:117] "RemoveContainer" containerID="4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b" Apr 17 14:38:25.628546 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.628530 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b"} err="failed to get container status \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": rpc error: code = NotFound desc = could not find container \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": container with ID starting with 4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b not found: ID does not exist" Apr 17 14:38:25.628583 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.628546 2577 scope.go:117] "RemoveContainer" containerID="9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770" Apr 17 14:38:25.628719 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.628703 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770"} err="failed to get container status \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": rpc error: code = NotFound desc = could not find container 
\"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": container with ID starting with 9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770 not found: ID does not exist" Apr 17 14:38:25.628719 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.628717 2577 scope.go:117] "RemoveContainer" containerID="629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f" Apr 17 14:38:25.628887 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.628872 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f"} err="failed to get container status \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": rpc error: code = NotFound desc = could not find container \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": container with ID starting with 629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f not found: ID does not exist" Apr 17 14:38:25.628887 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.628885 2577 scope.go:117] "RemoveContainer" containerID="83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10" Apr 17 14:38:25.629077 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.629059 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10"} err="failed to get container status \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": rpc error: code = NotFound desc = could not find container \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": container with ID starting with 83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10 not found: ID does not exist" Apr 17 14:38:25.629125 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.629076 2577 scope.go:117] "RemoveContainer" 
containerID="9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8"
Apr 17 14:38:25.629323 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.629305 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8"} err="failed to get container status \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": rpc error: code = NotFound desc = could not find container \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": container with ID starting with 9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8 not found: ID does not exist"
Apr 17 14:38:25.629390 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.629328 2577 scope.go:117] "RemoveContainer" containerID="05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a"
Apr 17 14:38:25.629548 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.629531 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a"} err="failed to get container status \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": rpc error: code = NotFound desc = could not find container \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": container with ID starting with 05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a not found: ID does not exist"
Apr 17 14:38:25.629613 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.629549 2577 scope.go:117] "RemoveContainer" containerID="1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5"
Apr 17 14:38:25.629771 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.629756 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5"} err="failed to get container status \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": rpc error: code = NotFound desc = could not find container \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": container with ID starting with 1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5 not found: ID does not exist"
Apr 17 14:38:25.629814 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.629773 2577 scope.go:117] "RemoveContainer" containerID="4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b"
Apr 17 14:38:25.629964 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.629947 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.630050 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.630013 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b"} err="failed to get container status \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": rpc error: code = NotFound desc = could not find container \"4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b\": container with ID starting with 4817b9efafe8178f04b37bd5aa6d451cb3d97d30b0d1b88afd3ccd115f02343b not found: ID does not exist"
Apr 17 14:38:25.630050 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.630030 2577 scope.go:117] "RemoveContainer" containerID="9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770"
Apr 17 14:38:25.630290 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.630263 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770"} err="failed to get container status \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": rpc error: code = NotFound desc = could not find container \"9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770\": container with ID starting with 9cd0831c61563c11a7457e3c70b152aaf4f1701f3ef7f00a4c317ba3ea4a8770 not found: ID does not exist"
Apr 17 14:38:25.630372 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.630293 2577 scope.go:117] "RemoveContainer" containerID="629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f"
Apr 17 14:38:25.630621 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.630594 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f"} err="failed to get container status \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": rpc error: code = NotFound desc = could not find container \"629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f\": container with ID starting with 629fe07e90a9e7dff27bca59fd967bd7821a329b502591b03a21cc1b8a58db5f not found: ID does not exist"
Apr 17 14:38:25.630697 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.630623 2577 scope.go:117] "RemoveContainer" containerID="83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10"
Apr 17 14:38:25.630853 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.630834 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10"} err="failed to get container status \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": rpc error: code = NotFound desc = could not find container \"83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10\": container with ID starting with 83358955b7ed3e7d5753adc8c695a33b2cc01ba4a6f5511a34681119fe2ccc10 not found: ID does not exist"
Apr 17 14:38:25.630909 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.630855 2577 scope.go:117] "RemoveContainer" containerID="9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8"
Apr 17 14:38:25.631106 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.631084 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8"} err="failed to get container status \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": rpc error: code = NotFound desc = could not find container \"9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8\": container with ID starting with 9953db1a94638d9147c6ad946f781cac49cfc30938eb42cc73886e912beb67a8 not found: ID does not exist"
Apr 17 14:38:25.631176 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.631107 2577 scope.go:117] "RemoveContainer" containerID="05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a"
Apr 17 14:38:25.631433 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.631415 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a"} err="failed to get container status \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": rpc error: code = NotFound desc = could not find container \"05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a\": container with ID starting with 05f177caef793210b5fb4ef985d163fee62ced65e0057b6f40ab9f1234a1a57a not found: ID does not exist"
Apr 17 14:38:25.631499 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.631432 2577 scope.go:117] "RemoveContainer" containerID="1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5"
Apr 17 14:38:25.631648 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.631632 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5"} err="failed to get container status \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": rpc error: code = NotFound desc = could not find container \"1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5\": container with ID starting with 1d3f136c19077086291923d7084aaf1049162611d2ef8873511ff4e81fc8ded5 not found: ID does not exist"
Apr 17 14:38:25.634519 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.634497 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 14:38:25.634598 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.634553 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 14:38:25.634598 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.634580 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 14:38:25.635052 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.634915 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 14:38:25.635052 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.635004 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 14:38:25.635052 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.635049 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-c1pivc3p2ncpj\""
Apr 17 14:38:25.635331 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.635077 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 14:38:25.635331 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.635009 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 14:38:25.635331 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.635131 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 14:38:25.635486 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.635419 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-dqltf\""
Apr 17 14:38:25.635486 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.635419 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 14:38:25.635486 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.635466 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 14:38:25.647770 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.647750 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 14:38:25.651043 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.651024 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 14:38:25.660424 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.660400 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 14:38:25.770537 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.770537 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.770799 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.770799 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770613 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-web-config\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.770799 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.770799 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770716 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.770799 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.770799 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-config\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.770799 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8879eb04-ef63-43a9-8d31-74a573a4ede8-config-out\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.770799 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.771217 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8879eb04-ef63-43a9-8d31-74a573a4ede8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.771217 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.771217 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770943 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8879eb04-ef63-43a9-8d31-74a573a4ede8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.771217 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.770974 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.771217 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.771014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.771217 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.771053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btnzk\" (UniqueName: \"kubernetes.io/projected/8879eb04-ef63-43a9-8d31-74a573a4ede8-kube-api-access-btnzk\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.771217 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.771126 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.771217 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.771144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.799747 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.799713 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81332ce6-34ef-4fc2-ade5-81df5c9ccb70" path="/var/lib/kubelet/pods/81332ce6-34ef-4fc2-ade5-81df5c9ccb70/volumes"
Apr 17 14:38:25.872214 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8879eb04-ef63-43a9-8d31-74a573a4ede8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.872214 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.872214 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872198 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8879eb04-ef63-43a9-8d31-74a573a4ede8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.872448 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.872448 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.872448 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btnzk\" (UniqueName: \"kubernetes.io/projected/8879eb04-ef63-43a9-8d31-74a573a4ede8-kube-api-access-btnzk\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.872448 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.872808 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.872888 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.872952 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8879eb04-ef63-43a9-8d31-74a573a4ede8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.872952 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872919 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.873066 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.873066 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.872989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-web-config\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.875042 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.873647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.875042 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.874302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.875042 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.873904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.875042 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.874475 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.875042 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.874537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-config\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.875042 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.874592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8879eb04-ef63-43a9-8d31-74a573a4ede8-config-out\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.875042 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.874633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.875477 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.875041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.875477 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.875438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8879eb04-ef63-43a9-8d31-74a573a4ede8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.875910 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.875885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.876105 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.876085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.876905 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.876854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.877306 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.877092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.877306 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.877117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.879627 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.878286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8879eb04-ef63-43a9-8d31-74a573a4ede8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.879627 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.879606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.879772 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.879642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.879844 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.879818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-web-config\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.880044 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.879919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.880044 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.879986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8879eb04-ef63-43a9-8d31-74a573a4ede8-config-out\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.880044 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.879995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.882444 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.882423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btnzk\" (UniqueName: \"kubernetes.io/projected/8879eb04-ef63-43a9-8d31-74a573a4ede8-kube-api-access-btnzk\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.883211 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.883191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8879eb04-ef63-43a9-8d31-74a573a4ede8-config\") pod \"prometheus-k8s-0\" (UID: \"8879eb04-ef63-43a9-8d31-74a573a4ede8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:25.939681 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:25.939639 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:26.066641 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:26.066567 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 14:38:26.069186 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:38:26.069121 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8879eb04_ef63_43a9_8d31_74a573a4ede8.slice/crio-636c16ec11d82db4dd5d13e7f2f37a05f2b0a34c728ad931a3cc75561d9accce WatchSource:0}: Error finding container 636c16ec11d82db4dd5d13e7f2f37a05f2b0a34c728ad931a3cc75561d9accce: Status 404 returned error can't find the container with id 636c16ec11d82db4dd5d13e7f2f37a05f2b0a34c728ad931a3cc75561d9accce
Apr 17 14:38:26.576844 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:26.576809 2577 generic.go:358] "Generic (PLEG): container finished" podID="8879eb04-ef63-43a9-8d31-74a573a4ede8" containerID="53ae4bf1c5b4a0a2058ae62804102a25ce7473fd08cf94e6ace970699d588ba7" exitCode=0
Apr 17 14:38:26.577041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:26.576884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8879eb04-ef63-43a9-8d31-74a573a4ede8","Type":"ContainerDied","Data":"53ae4bf1c5b4a0a2058ae62804102a25ce7473fd08cf94e6ace970699d588ba7"}
Apr 17 14:38:26.577041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:26.576909 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8879eb04-ef63-43a9-8d31-74a573a4ede8","Type":"ContainerStarted","Data":"636c16ec11d82db4dd5d13e7f2f37a05f2b0a34c728ad931a3cc75561d9accce"}
Apr 17 14:38:27.583422 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:27.583392 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8879eb04-ef63-43a9-8d31-74a573a4ede8","Type":"ContainerStarted","Data":"554493b03561e1033e65afa041945d142d20b28d05197c0bd5d4d5e75e07af15"}
Apr 17 14:38:27.583422 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:27.583428 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8879eb04-ef63-43a9-8d31-74a573a4ede8","Type":"ContainerStarted","Data":"0e526d55ae2df25eca8c0cc656e1ae96ed29e5f7774492d0c2ee0ccc8dea0316"}
Apr 17 14:38:27.583422 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:27.583441 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8879eb04-ef63-43a9-8d31-74a573a4ede8","Type":"ContainerStarted","Data":"3293b7fc8ba1588c93502fdb41493a77114d8ab533b2a50da12da427ea30165c"}
Apr 17 14:38:27.583915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:27.583453 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8879eb04-ef63-43a9-8d31-74a573a4ede8","Type":"ContainerStarted","Data":"513cddcff8ba4b9cc21c2b3192820202d46587022523490307f0548f9e4834de"}
Apr 17 14:38:27.583915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:27.583463 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8879eb04-ef63-43a9-8d31-74a573a4ede8","Type":"ContainerStarted","Data":"576de530fb029094c638003e3e3657f8b263619a7540eaad441645afe6665395"}
Apr 17 14:38:27.583915 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:27.583473 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8879eb04-ef63-43a9-8d31-74a573a4ede8","Type":"ContainerStarted","Data":"55e8bfabcfef2e858be7b6567c344f8e65e0f47f8e41b7e1086d63e4b29f356d"}
Apr 17 14:38:27.609164 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:27.609106 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.6090881379999997 podStartE2EDuration="2.609088138s" podCreationTimestamp="2026-04-17 14:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:38:27.606531625 +0000 UTC m=+276.343759637" watchObservedRunningTime="2026-04-17 14:38:27.609088138 +0000 UTC m=+276.346316140"
Apr 17 14:38:30.940243 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:30.940211 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:38:51.752062 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:38:51.752037 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 14:39:25.940733 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:39:25.940689 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:39:25.956121 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:39:25.956093 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:39:26.762418 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:39:26.762388 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:41:56.295843 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.295808 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf"]
Apr 17 14:41:56.298947 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.298930 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf"
Apr 17 14:41:56.301511 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.301486 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 14:41:56.301936 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.301917 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 14:41:56.301936 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.301930 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 14:41:56.302046 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.301964 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-bfw6w\""
Apr 17 14:41:56.302082 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.302062 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 14:41:56.322409 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.322388 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf"]
Apr 17 14:41:56.444088 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.444053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-bvxnf\" (UID: \"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:41:56.444310 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.444115 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-bvxnf\" (UID: \"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:41:56.444310 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.444198 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgfjd\" (UniqueName: \"kubernetes.io/projected/2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e-kube-api-access-qgfjd\") pod \"opendatahub-operator-controller-manager-6c585549bc-bvxnf\" (UID: \"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:41:56.545469 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.545432 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-bvxnf\" (UID: \"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:41:56.545664 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.545480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgfjd\" (UniqueName: 
\"kubernetes.io/projected/2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e-kube-api-access-qgfjd\") pod \"opendatahub-operator-controller-manager-6c585549bc-bvxnf\" (UID: \"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:41:56.545664 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.545548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-bvxnf\" (UID: \"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:41:56.547992 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.547938 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-bvxnf\" (UID: \"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:41:56.547992 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.547965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-bvxnf\" (UID: \"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:41:56.561231 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.561202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgfjd\" (UniqueName: \"kubernetes.io/projected/2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e-kube-api-access-qgfjd\") pod \"opendatahub-operator-controller-manager-6c585549bc-bvxnf\" (UID: 
\"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:41:56.609249 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.609219 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:41:56.728298 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.728272 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf"] Apr 17 14:41:56.731187 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:41:56.731142 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a73dbbe_9bfd_433b_b2b3_3ef1439e3b4e.slice/crio-740e13ec19072a9ed85337a916e0f4a939355bc3d73e8c2b6f507c263656ce18 WatchSource:0}: Error finding container 740e13ec19072a9ed85337a916e0f4a939355bc3d73e8c2b6f507c263656ce18: Status 404 returned error can't find the container with id 740e13ec19072a9ed85337a916e0f4a939355bc3d73e8c2b6f507c263656ce18 Apr 17 14:41:56.732910 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:56.732891 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:41:57.144941 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:41:57.144908 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" event={"ID":"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e","Type":"ContainerStarted","Data":"740e13ec19072a9ed85337a916e0f4a939355bc3d73e8c2b6f507c263656ce18"} Apr 17 14:42:00.154694 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:00.154648 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" 
event={"ID":"2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e","Type":"ContainerStarted","Data":"69c43f0c3d9ac351315b86a91afeb33424f324d25003ea4b207b2520deacab40"} Apr 17 14:42:00.155180 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:00.154765 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:42:00.176793 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:00.176747 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" podStartSLOduration=1.759443862 podStartE2EDuration="4.176735333s" podCreationTimestamp="2026-04-17 14:41:56 +0000 UTC" firstStartedPulling="2026-04-17 14:41:56.733047787 +0000 UTC m=+485.470275767" lastFinishedPulling="2026-04-17 14:41:59.150339257 +0000 UTC m=+487.887567238" observedRunningTime="2026-04-17 14:42:00.176616333 +0000 UTC m=+488.913844347" watchObservedRunningTime="2026-04-17 14:42:00.176735333 +0000 UTC m=+488.913963335" Apr 17 14:42:11.160133 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:11.160100 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-bvxnf" Apr 17 14:42:13.886876 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:13.886839 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-596bc867c-8h8gh"] Apr 17 14:42:13.893253 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:13.893222 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:13.895478 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:13.895453 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 14:42:13.896298 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:13.896282 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-hdxjw\"" Apr 17 14:42:13.896298 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:13.896286 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 14:42:13.905096 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:13.905074 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-596bc867c-8h8gh"] Apr 17 14:42:13.994294 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:13.994258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b87dcf59-e699-4e0b-8923-2a544ff7daee-tmp\") pod \"kube-auth-proxy-596bc867c-8h8gh\" (UID: \"b87dcf59-e699-4e0b-8923-2a544ff7daee\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:13.994465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:13.994326 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcznl\" (UniqueName: \"kubernetes.io/projected/b87dcf59-e699-4e0b-8923-2a544ff7daee-kube-api-access-tcznl\") pod \"kube-auth-proxy-596bc867c-8h8gh\" (UID: \"b87dcf59-e699-4e0b-8923-2a544ff7daee\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:13.994465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:13.994373 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b87dcf59-e699-4e0b-8923-2a544ff7daee-tls-certs\") pod \"kube-auth-proxy-596bc867c-8h8gh\" (UID: \"b87dcf59-e699-4e0b-8923-2a544ff7daee\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:14.095578 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:14.095546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b87dcf59-e699-4e0b-8923-2a544ff7daee-tls-certs\") pod \"kube-auth-proxy-596bc867c-8h8gh\" (UID: \"b87dcf59-e699-4e0b-8923-2a544ff7daee\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:14.095752 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:14.095602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b87dcf59-e699-4e0b-8923-2a544ff7daee-tmp\") pod \"kube-auth-proxy-596bc867c-8h8gh\" (UID: \"b87dcf59-e699-4e0b-8923-2a544ff7daee\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:14.095752 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:14.095638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcznl\" (UniqueName: \"kubernetes.io/projected/b87dcf59-e699-4e0b-8923-2a544ff7daee-kube-api-access-tcznl\") pod \"kube-auth-proxy-596bc867c-8h8gh\" (UID: \"b87dcf59-e699-4e0b-8923-2a544ff7daee\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:14.098082 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:14.098048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b87dcf59-e699-4e0b-8923-2a544ff7daee-tmp\") pod \"kube-auth-proxy-596bc867c-8h8gh\" (UID: \"b87dcf59-e699-4e0b-8923-2a544ff7daee\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:14.098242 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:14.098214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b87dcf59-e699-4e0b-8923-2a544ff7daee-tls-certs\") pod \"kube-auth-proxy-596bc867c-8h8gh\" (UID: \"b87dcf59-e699-4e0b-8923-2a544ff7daee\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:14.103208 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:14.103180 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcznl\" (UniqueName: \"kubernetes.io/projected/b87dcf59-e699-4e0b-8923-2a544ff7daee-kube-api-access-tcznl\") pod \"kube-auth-proxy-596bc867c-8h8gh\" (UID: \"b87dcf59-e699-4e0b-8923-2a544ff7daee\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:14.202703 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:14.202631 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" Apr 17 14:42:14.317478 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:14.317443 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-596bc867c-8h8gh"] Apr 17 14:42:14.320620 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:42:14.320591 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb87dcf59_e699_4e0b_8923_2a544ff7daee.slice/crio-e4257a6f6dcbbe8255bdcec2c468ef22ed4924a46ccb908d4accac2b4c0cbaff WatchSource:0}: Error finding container e4257a6f6dcbbe8255bdcec2c468ef22ed4924a46ccb908d4accac2b4c0cbaff: Status 404 returned error can't find the container with id e4257a6f6dcbbe8255bdcec2c468ef22ed4924a46ccb908d4accac2b4c0cbaff Apr 17 14:42:15.200824 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:15.200777 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" event={"ID":"b87dcf59-e699-4e0b-8923-2a544ff7daee","Type":"ContainerStarted","Data":"e4257a6f6dcbbe8255bdcec2c468ef22ed4924a46ccb908d4accac2b4c0cbaff"} Apr 
17 14:42:19.212143 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:19.212102 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" event={"ID":"b87dcf59-e699-4e0b-8923-2a544ff7daee","Type":"ContainerStarted","Data":"203bd3ed7f412b41c32de1c2b9ee716f7b6a670b1bd0c6843fdcb5c67ae92c63"} Apr 17 14:42:19.244259 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:19.244216 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-596bc867c-8h8gh" podStartSLOduration=2.213821549 podStartE2EDuration="6.244197993s" podCreationTimestamp="2026-04-17 14:42:13 +0000 UTC" firstStartedPulling="2026-04-17 14:42:14.322715439 +0000 UTC m=+503.059943418" lastFinishedPulling="2026-04-17 14:42:18.353091868 +0000 UTC m=+507.090319862" observedRunningTime="2026-04-17 14:42:19.243747082 +0000 UTC m=+507.980975083" watchObservedRunningTime="2026-04-17 14:42:19.244197993 +0000 UTC m=+507.981425997" Apr 17 14:42:24.261803 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.261768 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-mgflz"] Apr 17 14:42:24.264452 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.264433 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:42:24.266579 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.266559 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 17 14:42:24.266770 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.266751 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-4glfg\"" Apr 17 14:42:24.274221 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.274198 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-mgflz"] Apr 17 14:42:24.279888 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.279853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2183913-a604-4568-9192-4bf2e28b1c08-cert\") pod \"kserve-controller-manager-856948b99f-mgflz\" (UID: \"b2183913-a604-4568-9192-4bf2e28b1c08\") " pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:42:24.280010 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.279900 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-788pc\" (UniqueName: \"kubernetes.io/projected/b2183913-a604-4568-9192-4bf2e28b1c08-kube-api-access-788pc\") pod \"kserve-controller-manager-856948b99f-mgflz\" (UID: \"b2183913-a604-4568-9192-4bf2e28b1c08\") " pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:42:24.380725 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.380702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2183913-a604-4568-9192-4bf2e28b1c08-cert\") pod \"kserve-controller-manager-856948b99f-mgflz\" (UID: \"b2183913-a604-4568-9192-4bf2e28b1c08\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:42:24.380853 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.380732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-788pc\" (UniqueName: \"kubernetes.io/projected/b2183913-a604-4568-9192-4bf2e28b1c08-kube-api-access-788pc\") pod \"kserve-controller-manager-856948b99f-mgflz\" (UID: \"b2183913-a604-4568-9192-4bf2e28b1c08\") " pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:42:24.380894 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:42:24.380858 2577 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 14:42:24.380954 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:42:24.380943 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2183913-a604-4568-9192-4bf2e28b1c08-cert podName:b2183913-a604-4568-9192-4bf2e28b1c08 nodeName:}" failed. No retries permitted until 2026-04-17 14:42:24.880920382 +0000 UTC m=+513.618148365 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2183913-a604-4568-9192-4bf2e28b1c08-cert") pod "kserve-controller-manager-856948b99f-mgflz" (UID: "b2183913-a604-4568-9192-4bf2e28b1c08") : secret "kserve-webhook-server-cert" not found Apr 17 14:42:24.389480 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.389459 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-788pc\" (UniqueName: \"kubernetes.io/projected/b2183913-a604-4568-9192-4bf2e28b1c08-kube-api-access-788pc\") pod \"kserve-controller-manager-856948b99f-mgflz\" (UID: \"b2183913-a604-4568-9192-4bf2e28b1c08\") " pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:42:24.884205 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.884147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2183913-a604-4568-9192-4bf2e28b1c08-cert\") pod \"kserve-controller-manager-856948b99f-mgflz\" (UID: \"b2183913-a604-4568-9192-4bf2e28b1c08\") " pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:42:24.886493 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:24.886473 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2183913-a604-4568-9192-4bf2e28b1c08-cert\") pod \"kserve-controller-manager-856948b99f-mgflz\" (UID: \"b2183913-a604-4568-9192-4bf2e28b1c08\") " pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:42:25.177136 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:25.177055 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:42:25.294737 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:42:25.294697 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2183913_a604_4568_9192_4bf2e28b1c08.slice/crio-ebaf602a60375ab399bf5ef8318475fc3b42c922f6c3076b84cfddaeaad7e154 WatchSource:0}: Error finding container ebaf602a60375ab399bf5ef8318475fc3b42c922f6c3076b84cfddaeaad7e154: Status 404 returned error can't find the container with id ebaf602a60375ab399bf5ef8318475fc3b42c922f6c3076b84cfddaeaad7e154 Apr 17 14:42:25.295073 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:25.295050 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-mgflz"] Apr 17 14:42:26.232782 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:26.232745 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" event={"ID":"b2183913-a604-4568-9192-4bf2e28b1c08","Type":"ContainerStarted","Data":"ebaf602a60375ab399bf5ef8318475fc3b42c922f6c3076b84cfddaeaad7e154"} Apr 17 14:42:28.240270 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.240187 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" event={"ID":"b2183913-a604-4568-9192-4bf2e28b1c08","Type":"ContainerStarted","Data":"2063a3f38e140b87de7d5654c7a68dbc7819f1dd29ce0253581aa4488f9af6b3"} Apr 17 14:42:28.240615 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.240305 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:42:28.262395 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.261939 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" 
podStartSLOduration=1.5804821150000001 podStartE2EDuration="4.261925631s" podCreationTimestamp="2026-04-17 14:42:24 +0000 UTC" firstStartedPulling="2026-04-17 14:42:25.295947022 +0000 UTC m=+514.033175001" lastFinishedPulling="2026-04-17 14:42:27.977390524 +0000 UTC m=+516.714618517" observedRunningTime="2026-04-17 14:42:28.261720181 +0000 UTC m=+516.998948192" watchObservedRunningTime="2026-04-17 14:42:28.261925631 +0000 UTC m=+516.999153633" Apr 17 14:42:28.824193 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.823548 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9"] Apr 17 14:42:28.827000 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.826975 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" Apr 17 14:42:28.829409 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.829379 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 14:42:28.829409 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.829385 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 14:42:28.829409 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.829409 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-8kbqg\"" Apr 17 14:42:28.838303 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.838282 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9"] Apr 17 14:42:28.916317 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.916296 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-954vn\" (UniqueName: 
\"kubernetes.io/projected/d32dff17-eefa-4274-9af7-e38d9f43fbc9-kube-api-access-954vn\") pod \"servicemesh-operator3-55f49c5f94-qgjc9\" (UID: \"d32dff17-eefa-4274-9af7-e38d9f43fbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" Apr 17 14:42:28.916429 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:28.916344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d32dff17-eefa-4274-9af7-e38d9f43fbc9-operator-config\") pod \"servicemesh-operator3-55f49c5f94-qgjc9\" (UID: \"d32dff17-eefa-4274-9af7-e38d9f43fbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" Apr 17 14:42:29.017290 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:29.017265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d32dff17-eefa-4274-9af7-e38d9f43fbc9-operator-config\") pod \"servicemesh-operator3-55f49c5f94-qgjc9\" (UID: \"d32dff17-eefa-4274-9af7-e38d9f43fbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" Apr 17 14:42:29.017371 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:29.017321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-954vn\" (UniqueName: \"kubernetes.io/projected/d32dff17-eefa-4274-9af7-e38d9f43fbc9-kube-api-access-954vn\") pod \"servicemesh-operator3-55f49c5f94-qgjc9\" (UID: \"d32dff17-eefa-4274-9af7-e38d9f43fbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" Apr 17 14:42:29.019615 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:29.019588 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d32dff17-eefa-4274-9af7-e38d9f43fbc9-operator-config\") pod \"servicemesh-operator3-55f49c5f94-qgjc9\" (UID: \"d32dff17-eefa-4274-9af7-e38d9f43fbc9\") " 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" Apr 17 14:42:29.025259 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:29.025239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-954vn\" (UniqueName: \"kubernetes.io/projected/d32dff17-eefa-4274-9af7-e38d9f43fbc9-kube-api-access-954vn\") pod \"servicemesh-operator3-55f49c5f94-qgjc9\" (UID: \"d32dff17-eefa-4274-9af7-e38d9f43fbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" Apr 17 14:42:29.136026 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:29.135967 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" Apr 17 14:42:29.270550 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:29.270530 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9"] Apr 17 14:42:29.273921 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:42:29.273892 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd32dff17_eefa_4274_9af7_e38d9f43fbc9.slice/crio-9709362df8ae7c351c4219cdd167c1f435dac8941f368e6e5c695c180602ecd1 WatchSource:0}: Error finding container 9709362df8ae7c351c4219cdd167c1f435dac8941f368e6e5c695c180602ecd1: Status 404 returned error can't find the container with id 9709362df8ae7c351c4219cdd167c1f435dac8941f368e6e5c695c180602ecd1 Apr 17 14:42:30.247792 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:30.247757 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" event={"ID":"d32dff17-eefa-4274-9af7-e38d9f43fbc9","Type":"ContainerStarted","Data":"9709362df8ae7c351c4219cdd167c1f435dac8941f368e6e5c695c180602ecd1"} Apr 17 14:42:32.255493 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:32.255461 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" event={"ID":"d32dff17-eefa-4274-9af7-e38d9f43fbc9","Type":"ContainerStarted","Data":"7d448315b3962e7206d469576f13f6a372311ad7d1c9558241dd854fdcd89ea1"} Apr 17 14:42:32.255847 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:32.255618 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" Apr 17 14:42:32.282037 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:32.281985 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" podStartSLOduration=1.995850862 podStartE2EDuration="4.281968272s" podCreationTimestamp="2026-04-17 14:42:28 +0000 UTC" firstStartedPulling="2026-04-17 14:42:29.277010123 +0000 UTC m=+518.014238103" lastFinishedPulling="2026-04-17 14:42:31.563127535 +0000 UTC m=+520.300355513" observedRunningTime="2026-04-17 14:42:32.280384133 +0000 UTC m=+521.017612136" watchObservedRunningTime="2026-04-17 14:42:32.281968272 +0000 UTC m=+521.019196275" Apr 17 14:42:43.260663 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:43.260589 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qgjc9" Apr 17 14:42:44.716907 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.716873 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2"] Apr 17 14:42:44.719805 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.719787 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.723393 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.723367 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 14:42:44.723522 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.723430 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 14:42:44.724333 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.724318 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-7bdzn\"" Apr 17 14:42:44.724400 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.724321 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 14:42:44.725874 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.725854 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 14:42:44.734296 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.734275 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2"] Apr 17 14:42:44.860504 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.860470 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.860504 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.860505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn6m7\" 
(UniqueName: \"kubernetes.io/projected/0cd0bba8-6306-4ba8-bb06-88442e230f9d-kube-api-access-cn6m7\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.860703 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.860598 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.860703 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.860633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/0cd0bba8-6306-4ba8-bb06-88442e230f9d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.860703 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.860660 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.860703 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.860679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-token\") pod 
\"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.860826 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.860710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/0cd0bba8-6306-4ba8-bb06-88442e230f9d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.961678 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.961638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.961678 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.961674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.961916 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.961698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/0cd0bba8-6306-4ba8-bb06-88442e230f9d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.961916 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.961745 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.961916 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.961770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6m7\" (UniqueName: \"kubernetes.io/projected/0cd0bba8-6306-4ba8-bb06-88442e230f9d-kube-api-access-cn6m7\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.962060 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.961938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.962060 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.961968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/0cd0bba8-6306-4ba8-bb06-88442e230f9d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.962595 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.962560 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.964299 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.964266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.964433 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.964411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/0cd0bba8-6306-4ba8-bb06-88442e230f9d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.964594 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.964572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.964697 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.964683 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/0cd0bba8-6306-4ba8-bb06-88442e230f9d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.969566 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.969510 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0cd0bba8-6306-4ba8-bb06-88442e230f9d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:44.969566 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:44.969543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6m7\" (UniqueName: \"kubernetes.io/projected/0cd0bba8-6306-4ba8-bb06-88442e230f9d-kube-api-access-cn6m7\") pod \"istiod-openshift-gateway-55ff986f96-rvcl2\" (UID: \"0cd0bba8-6306-4ba8-bb06-88442e230f9d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:45.029612 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:45.029574 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:45.188450 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:45.188425 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2"] Apr 17 14:42:45.190530 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:42:45.190500 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cd0bba8_6306_4ba8_bb06_88442e230f9d.slice/crio-08efaec8a056146c13fbc7093d86fea651ca9c541fca63f1159965a681e86f60 WatchSource:0}: Error finding container 08efaec8a056146c13fbc7093d86fea651ca9c541fca63f1159965a681e86f60: Status 404 returned error can't find the container with id 08efaec8a056146c13fbc7093d86fea651ca9c541fca63f1159965a681e86f60 Apr 17 14:42:45.295928 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:45.295894 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" 
event={"ID":"0cd0bba8-6306-4ba8-bb06-88442e230f9d","Type":"ContainerStarted","Data":"08efaec8a056146c13fbc7093d86fea651ca9c541fca63f1159965a681e86f60"} Apr 17 14:42:47.661359 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:47.661315 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 17 14:42:47.661636 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:47.661391 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 17 14:42:48.308013 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:48.307975 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" event={"ID":"0cd0bba8-6306-4ba8-bb06-88442e230f9d","Type":"ContainerStarted","Data":"07f39d59121cfe691be57a755a94115d16479a4e32d526062f4d3f880274e841"} Apr 17 14:42:48.308245 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:48.308115 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:48.309857 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:48.309830 2577 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-rvcl2 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 17 14:42:48.309983 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:48.309876 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" podUID="0cd0bba8-6306-4ba8-bb06-88442e230f9d" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 14:42:48.326278 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:48.326226 
2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" podStartSLOduration=1.857895057 podStartE2EDuration="4.326209566s" podCreationTimestamp="2026-04-17 14:42:44 +0000 UTC" firstStartedPulling="2026-04-17 14:42:45.192701911 +0000 UTC m=+533.929929897" lastFinishedPulling="2026-04-17 14:42:47.661016412 +0000 UTC m=+536.398244406" observedRunningTime="2026-04-17 14:42:48.325293247 +0000 UTC m=+537.062521247" watchObservedRunningTime="2026-04-17 14:42:48.326209566 +0000 UTC m=+537.063437568" Apr 17 14:42:49.311847 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:49.311817 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rvcl2" Apr 17 14:42:59.248879 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:42:59.248842 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-mgflz" Apr 17 14:43:46.191303 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.191271 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-dndvm"] Apr 17 14:43:46.193648 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.193631 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-dndvm" Apr 17 14:43:46.196895 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.196868 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-prpg6\"" Apr 17 14:43:46.197591 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.197572 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 14:43:46.197696 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.197623 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 14:43:46.211229 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.211209 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-dndvm"] Apr 17 14:43:46.382108 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.382070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdjnm\" (UniqueName: \"kubernetes.io/projected/1241cfaa-0e6c-4632-a9aa-606a1bd354c9-kube-api-access-gdjnm\") pod \"authorino-operator-657f44b778-dndvm\" (UID: \"1241cfaa-0e6c-4632-a9aa-606a1bd354c9\") " pod="kuadrant-system/authorino-operator-657f44b778-dndvm" Apr 17 14:43:46.482727 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.482645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdjnm\" (UniqueName: \"kubernetes.io/projected/1241cfaa-0e6c-4632-a9aa-606a1bd354c9-kube-api-access-gdjnm\") pod \"authorino-operator-657f44b778-dndvm\" (UID: \"1241cfaa-0e6c-4632-a9aa-606a1bd354c9\") " pod="kuadrant-system/authorino-operator-657f44b778-dndvm" Apr 17 14:43:46.491668 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.491633 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gdjnm\" (UniqueName: \"kubernetes.io/projected/1241cfaa-0e6c-4632-a9aa-606a1bd354c9-kube-api-access-gdjnm\") pod \"authorino-operator-657f44b778-dndvm\" (UID: \"1241cfaa-0e6c-4632-a9aa-606a1bd354c9\") " pod="kuadrant-system/authorino-operator-657f44b778-dndvm" Apr 17 14:43:46.503501 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.503464 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-dndvm" Apr 17 14:43:46.625188 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:46.625133 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-dndvm"] Apr 17 14:43:46.627717 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:43:46.627689 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1241cfaa_0e6c_4632_a9aa_606a1bd354c9.slice/crio-a9f9da2d92ec8270d2a64ae10154ac1839c10ad048a2f13afc12dc61631e243f WatchSource:0}: Error finding container a9f9da2d92ec8270d2a64ae10154ac1839c10ad048a2f13afc12dc61631e243f: Status 404 returned error can't find the container with id a9f9da2d92ec8270d2a64ae10154ac1839c10ad048a2f13afc12dc61631e243f Apr 17 14:43:47.492692 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:47.492655 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-dndvm" event={"ID":"1241cfaa-0e6c-4632-a9aa-606a1bd354c9","Type":"ContainerStarted","Data":"a9f9da2d92ec8270d2a64ae10154ac1839c10ad048a2f13afc12dc61631e243f"} Apr 17 14:43:48.497083 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:48.497000 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-dndvm" event={"ID":"1241cfaa-0e6c-4632-a9aa-606a1bd354c9","Type":"ContainerStarted","Data":"708efdc843b02733356f7853684cf31709d6f5e23a40b8f344af1fb44194cea6"} Apr 17 14:43:48.497455 ip-10-0-135-180 
kubenswrapper[2577]: I0417 14:43:48.497143 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-dndvm" Apr 17 14:43:48.515311 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:48.515261 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-dndvm" podStartSLOduration=0.950806759 podStartE2EDuration="2.51524714s" podCreationTimestamp="2026-04-17 14:43:46 +0000 UTC" firstStartedPulling="2026-04-17 14:43:46.629761202 +0000 UTC m=+595.366989195" lastFinishedPulling="2026-04-17 14:43:48.194201588 +0000 UTC m=+596.931429576" observedRunningTime="2026-04-17 14:43:48.514172588 +0000 UTC m=+597.251400580" watchObservedRunningTime="2026-04-17 14:43:48.51524714 +0000 UTC m=+597.252475140" Apr 17 14:43:54.153257 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.153217 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr"] Apr 17 14:43:54.156336 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.156309 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" Apr 17 14:43:54.159502 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.159469 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-nll7t\"" Apr 17 14:43:54.166717 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.166682 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr"] Apr 17 14:43:54.249148 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.249121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1db3726b-57eb-4526-9207-7b7c1c518a05-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" (UID: \"1db3726b-57eb-4526-9207-7b7c1c518a05\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" Apr 17 14:43:54.249298 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.249220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96r42\" (UniqueName: \"kubernetes.io/projected/1db3726b-57eb-4526-9207-7b7c1c518a05-kube-api-access-96r42\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" (UID: \"1db3726b-57eb-4526-9207-7b7c1c518a05\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" Apr 17 14:43:54.350293 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.350255 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1db3726b-57eb-4526-9207-7b7c1c518a05-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" (UID: \"1db3726b-57eb-4526-9207-7b7c1c518a05\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" Apr 17 14:43:54.350474 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.350319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96r42\" (UniqueName: \"kubernetes.io/projected/1db3726b-57eb-4526-9207-7b7c1c518a05-kube-api-access-96r42\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" (UID: \"1db3726b-57eb-4526-9207-7b7c1c518a05\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" Apr 17 14:43:54.350650 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.350630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1db3726b-57eb-4526-9207-7b7c1c518a05-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" (UID: \"1db3726b-57eb-4526-9207-7b7c1c518a05\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" Apr 17 14:43:54.359636 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.359605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96r42\" (UniqueName: \"kubernetes.io/projected/1db3726b-57eb-4526-9207-7b7c1c518a05-kube-api-access-96r42\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" (UID: \"1db3726b-57eb-4526-9207-7b7c1c518a05\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" Apr 17 14:43:54.467797 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.467720 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" Apr 17 14:43:54.587576 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:54.587547 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr"] Apr 17 14:43:54.590386 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:43:54.590357 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1db3726b_57eb_4526_9207_7b7c1c518a05.slice/crio-9d565267da463a5207a6d2bf811cd41a11a27a8e643b28c4e0ec22e2e4051a84 WatchSource:0}: Error finding container 9d565267da463a5207a6d2bf811cd41a11a27a8e643b28c4e0ec22e2e4051a84: Status 404 returned error can't find the container with id 9d565267da463a5207a6d2bf811cd41a11a27a8e643b28c4e0ec22e2e4051a84 Apr 17 14:43:55.520733 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:55.520693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" event={"ID":"1db3726b-57eb-4526-9207-7b7c1c518a05","Type":"ContainerStarted","Data":"9d565267da463a5207a6d2bf811cd41a11a27a8e643b28c4e0ec22e2e4051a84"} Apr 17 14:43:59.503543 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:43:59.503507 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-dndvm" Apr 17 14:44:00.539355 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:00.539320 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" event={"ID":"1db3726b-57eb-4526-9207-7b7c1c518a05","Type":"ContainerStarted","Data":"d417aa0f91eed448dc2b2a54abf3e7fa056ce8932ad28c0cc5fa7e6bf7825433"} Apr 17 14:44:00.539763 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:00.539451 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" Apr 17 14:44:00.558423 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:00.558366 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" podStartSLOduration=0.972494058 podStartE2EDuration="6.558347566s" podCreationTimestamp="2026-04-17 14:43:54 +0000 UTC" firstStartedPulling="2026-04-17 14:43:54.593345198 +0000 UTC m=+603.330573188" lastFinishedPulling="2026-04-17 14:44:00.179198713 +0000 UTC m=+608.916426696" observedRunningTime="2026-04-17 14:44:00.556917608 +0000 UTC m=+609.294145609" watchObservedRunningTime="2026-04-17 14:44:00.558347566 +0000 UTC m=+609.295575568" Apr 17 14:44:11.545895 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:11.545812 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" Apr 17 14:44:12.444920 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.444886 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr"] Apr 17 14:44:12.445143 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.445102 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" podUID="1db3726b-57eb-4526-9207-7b7c1c518a05" containerName="manager" containerID="cri-o://d417aa0f91eed448dc2b2a54abf3e7fa056ce8932ad28c0cc5fa7e6bf7825433" gracePeriod=2 Apr 17 14:44:12.450933 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.450903 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr"] Apr 17 14:44:12.474797 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.474760 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"] Apr 17 14:44:12.475479 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.475455 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1db3726b-57eb-4526-9207-7b7c1c518a05" containerName="manager" Apr 17 14:44:12.475479 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.475481 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db3726b-57eb-4526-9207-7b7c1c518a05" containerName="manager" Apr 17 14:44:12.475646 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.475639 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1db3726b-57eb-4526-9207-7b7c1c518a05" containerName="manager" Apr 17 14:44:12.481473 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.481395 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v" Apr 17 14:44:12.483849 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.483816 2577 status_manager.go:895] "Failed to get status for pod" podUID="1db3726b-57eb-4526-9207-7b7c1c518a05" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" is forbidden: User \"system:node:ip-10-0-135-180.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-180.ec2.internal' and this object" Apr 17 14:44:12.489644 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.489616 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"] Apr 17 14:44:12.580551 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.580520 2577 generic.go:358] "Generic (PLEG): container finished" podID="1db3726b-57eb-4526-9207-7b7c1c518a05" 
containerID="d417aa0f91eed448dc2b2a54abf3e7fa056ce8932ad28c0cc5fa7e6bf7825433" exitCode=0
Apr 17 14:44:12.606769 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.606740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaa55bca-e88d-413d-8854-e6789ce9579c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-sfp5v\" (UID: \"eaa55bca-e88d-413d-8854-e6789ce9579c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:12.606938 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.606783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57fvv\" (UniqueName: \"kubernetes.io/projected/eaa55bca-e88d-413d-8854-e6789ce9579c-kube-api-access-57fvv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-sfp5v\" (UID: \"eaa55bca-e88d-413d-8854-e6789ce9579c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:12.684438 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.684415 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr"
Apr 17 14:44:12.686479 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.686450 2577 status_manager.go:895] "Failed to get status for pod" podUID="1db3726b-57eb-4526-9207-7b7c1c518a05" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" is forbidden: User \"system:node:ip-10-0-135-180.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-180.ec2.internal' and this object"
Apr 17 14:44:12.707952 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.707879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaa55bca-e88d-413d-8854-e6789ce9579c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-sfp5v\" (UID: \"eaa55bca-e88d-413d-8854-e6789ce9579c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:12.707952 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.707917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57fvv\" (UniqueName: \"kubernetes.io/projected/eaa55bca-e88d-413d-8854-e6789ce9579c-kube-api-access-57fvv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-sfp5v\" (UID: \"eaa55bca-e88d-413d-8854-e6789ce9579c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:12.708304 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.708282 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaa55bca-e88d-413d-8854-e6789ce9579c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-sfp5v\" (UID: \"eaa55bca-e88d-413d-8854-e6789ce9579c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:12.725003 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.724976 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57fvv\" (UniqueName: \"kubernetes.io/projected/eaa55bca-e88d-413d-8854-e6789ce9579c-kube-api-access-57fvv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-sfp5v\" (UID: \"eaa55bca-e88d-413d-8854-e6789ce9579c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:12.808746 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.808709 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96r42\" (UniqueName: \"kubernetes.io/projected/1db3726b-57eb-4526-9207-7b7c1c518a05-kube-api-access-96r42\") pod \"1db3726b-57eb-4526-9207-7b7c1c518a05\" (UID: \"1db3726b-57eb-4526-9207-7b7c1c518a05\") "
Apr 17 14:44:12.808911 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.808800 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1db3726b-57eb-4526-9207-7b7c1c518a05-extensions-socket-volume\") pod \"1db3726b-57eb-4526-9207-7b7c1c518a05\" (UID: \"1db3726b-57eb-4526-9207-7b7c1c518a05\") "
Apr 17 14:44:12.809283 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.809259 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1db3726b-57eb-4526-9207-7b7c1c518a05-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "1db3726b-57eb-4526-9207-7b7c1c518a05" (UID: "1db3726b-57eb-4526-9207-7b7c1c518a05"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:44:12.810890 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.810862 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db3726b-57eb-4526-9207-7b7c1c518a05-kube-api-access-96r42" (OuterVolumeSpecName: "kube-api-access-96r42") pod "1db3726b-57eb-4526-9207-7b7c1c518a05" (UID: "1db3726b-57eb-4526-9207-7b7c1c518a05"). InnerVolumeSpecName "kube-api-access-96r42". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:44:12.835661 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.835638 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:12.909696 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.909666 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-96r42\" (UniqueName: \"kubernetes.io/projected/1db3726b-57eb-4526-9207-7b7c1c518a05-kube-api-access-96r42\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\""
Apr 17 14:44:12.909790 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.909699 2577 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1db3726b-57eb-4526-9207-7b7c1c518a05-extensions-socket-volume\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\""
Apr 17 14:44:12.958803 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:12.958777 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"]
Apr 17 14:44:12.961527 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:44:12.961502 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaa55bca_e88d_413d_8854_e6789ce9579c.slice/crio-f7dcea26f775d55483a6dac5ef408d4794d81d6c397830c23849faeb1c17a4fa WatchSource:0}: Error finding container f7dcea26f775d55483a6dac5ef408d4794d81d6c397830c23849faeb1c17a4fa: Status 404 returned error can't find the container with id f7dcea26f775d55483a6dac5ef408d4794d81d6c397830c23849faeb1c17a4fa
Apr 17 14:44:13.585356 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:13.585322 2577 scope.go:117] "RemoveContainer" containerID="d417aa0f91eed448dc2b2a54abf3e7fa056ce8932ad28c0cc5fa7e6bf7825433"
Apr 17 14:44:13.585787 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:13.585324 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr"
Apr 17 14:44:13.586985 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:13.586960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v" event={"ID":"eaa55bca-e88d-413d-8854-e6789ce9579c","Type":"ContainerStarted","Data":"9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54"}
Apr 17 14:44:13.587084 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:13.586993 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v" event={"ID":"eaa55bca-e88d-413d-8854-e6789ce9579c","Type":"ContainerStarted","Data":"f7dcea26f775d55483a6dac5ef408d4794d81d6c397830c23849faeb1c17a4fa"}
Apr 17 14:44:13.587084 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:13.587050 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:13.587836 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:13.587812 2577 status_manager.go:895] "Failed to get status for pod" podUID="1db3726b-57eb-4526-9207-7b7c1c518a05" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" is forbidden: User \"system:node:ip-10-0-135-180.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-180.ec2.internal' and this object"
Apr 17 14:44:13.589501 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:13.589475 2577 status_manager.go:895] "Failed to get status for pod" podUID="1db3726b-57eb-4526-9207-7b7c1c518a05" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" is forbidden: User \"system:node:ip-10-0-135-180.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-180.ec2.internal' and this object"
Apr 17 14:44:13.611116 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:13.611071 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v" podStartSLOduration=1.61105962 podStartE2EDuration="1.61105962s" podCreationTimestamp="2026-04-17 14:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:44:13.609791492 +0000 UTC m=+622.347019505" watchObservedRunningTime="2026-04-17 14:44:13.61105962 +0000 UTC m=+622.348287621"
Apr 17 14:44:13.611793 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:13.611771 2577 status_manager.go:895] "Failed to get status for pod" podUID="1db3726b-57eb-4526-9207-7b7c1c518a05" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5d2wr" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5d2wr\" is forbidden: User \"system:node:ip-10-0-135-180.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-180.ec2.internal' and this object"
Apr 17 14:44:13.801756 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:13.801720 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db3726b-57eb-4526-9207-7b7c1c518a05" path="/var/lib/kubelet/pods/1db3726b-57eb-4526-9207-7b7c1c518a05/volumes"
Apr 17 14:44:24.592846 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:24.592811 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:28.929698 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:28.929662 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"]
Apr 17 14:44:28.935795 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:28.929973 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v" podUID="eaa55bca-e88d-413d-8854-e6789ce9579c" containerName="manager" containerID="cri-o://9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54" gracePeriod=10
Apr 17 14:44:29.967518 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:29.967494 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:30.057380 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.057291 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57fvv\" (UniqueName: \"kubernetes.io/projected/eaa55bca-e88d-413d-8854-e6789ce9579c-kube-api-access-57fvv\") pod \"eaa55bca-e88d-413d-8854-e6789ce9579c\" (UID: \"eaa55bca-e88d-413d-8854-e6789ce9579c\") "
Apr 17 14:44:30.057380 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.057359 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaa55bca-e88d-413d-8854-e6789ce9579c-extensions-socket-volume\") pod \"eaa55bca-e88d-413d-8854-e6789ce9579c\" (UID: \"eaa55bca-e88d-413d-8854-e6789ce9579c\") "
Apr 17 14:44:30.057775 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.057747 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaa55bca-e88d-413d-8854-e6789ce9579c-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "eaa55bca-e88d-413d-8854-e6789ce9579c" (UID: "eaa55bca-e88d-413d-8854-e6789ce9579c"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:44:30.059542 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.059522 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa55bca-e88d-413d-8854-e6789ce9579c-kube-api-access-57fvv" (OuterVolumeSpecName: "kube-api-access-57fvv") pod "eaa55bca-e88d-413d-8854-e6789ce9579c" (UID: "eaa55bca-e88d-413d-8854-e6789ce9579c"). InnerVolumeSpecName "kube-api-access-57fvv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:44:30.158797 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.158767 2577 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaa55bca-e88d-413d-8854-e6789ce9579c-extensions-socket-volume\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\""
Apr 17 14:44:30.158797 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.158792 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57fvv\" (UniqueName: \"kubernetes.io/projected/eaa55bca-e88d-413d-8854-e6789ce9579c-kube-api-access-57fvv\") on node \"ip-10-0-135-180.ec2.internal\" DevicePath \"\""
Apr 17 14:44:30.645269 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.645232 2577 generic.go:358] "Generic (PLEG): container finished" podID="eaa55bca-e88d-413d-8854-e6789ce9579c" containerID="9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54" exitCode=0
Apr 17 14:44:30.645431 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.645292 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"
Apr 17 14:44:30.645431 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.645305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v" event={"ID":"eaa55bca-e88d-413d-8854-e6789ce9579c","Type":"ContainerDied","Data":"9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54"}
Apr 17 14:44:30.645431 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.645335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v" event={"ID":"eaa55bca-e88d-413d-8854-e6789ce9579c","Type":"ContainerDied","Data":"f7dcea26f775d55483a6dac5ef408d4794d81d6c397830c23849faeb1c17a4fa"}
Apr 17 14:44:30.645431 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.645352 2577 scope.go:117] "RemoveContainer" containerID="9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54"
Apr 17 14:44:30.653531 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.653509 2577 scope.go:117] "RemoveContainer" containerID="9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54"
Apr 17 14:44:30.653794 ip-10-0-135-180 kubenswrapper[2577]: E0417 14:44:30.653774 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54\": container with ID starting with 9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54 not found: ID does not exist" containerID="9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54"
Apr 17 14:44:30.653853 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.653803 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54"} err="failed to get container status \"9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54\": rpc error: code = NotFound desc = could not find container \"9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54\": container with ID starting with 9d364dd4d1244afc63131ba4da8416fc34e28860fc6559aa20d8b80da840eb54 not found: ID does not exist"
Apr 17 14:44:30.667216 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.667191 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"]
Apr 17 14:44:30.670935 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:30.670913 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-sfp5v"]
Apr 17 14:44:31.799774 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:31.799741 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa55bca-e88d-413d-8854-e6789ce9579c" path="/var/lib/kubelet/pods/eaa55bca-e88d-413d-8854-e6789ce9579c/volumes"
Apr 17 14:44:45.119447 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.119408 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"]
Apr 17 14:44:45.120029 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.119719 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eaa55bca-e88d-413d-8854-e6789ce9579c" containerName="manager"
Apr 17 14:44:45.120029 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.119730 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa55bca-e88d-413d-8854-e6789ce9579c" containerName="manager"
Apr 17 14:44:45.120029 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.119787 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="eaa55bca-e88d-413d-8854-e6789ce9579c" containerName="manager"
Apr 17 14:44:45.122087 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.122058 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.124312 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.124291 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-dpw46\""
Apr 17 14:44:45.149244 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.149199 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"]
Apr 17 14:44:45.184484 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.184450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.184642 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.184504 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.184642 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.184537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.184642 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.184577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxjn7\" (UniqueName: \"kubernetes.io/projected/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-kube-api-access-vxjn7\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.184642 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.184603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.184903 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.184700 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.184903 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.184752 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.184903 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.184807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.184903 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.184840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.285650 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.285613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.285650 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.285651 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.285837 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.285773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.285837 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.285808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.285902 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.285839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.285902 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.285879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.285982 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.285914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.285982 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.285961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxjn7\" (UniqueName: \"kubernetes.io/projected/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-kube-api-access-vxjn7\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.286080 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.285987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.286132 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.286092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.286216 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.286182 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.286405 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.286361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.286505 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.286404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.286505 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.286466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.288352 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.288328 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.288780 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.288753 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.295767 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.295737 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxjn7\" (UniqueName: \"kubernetes.io/projected/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-kube-api-access-vxjn7\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.295989 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.295968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5c5fb49f-7118-4132-8ffb-5dfed76e4b1f-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-6zjm4\" (UID: \"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.448993 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.448906 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:45.581845 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.581813 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"]
Apr 17 14:44:45.584579 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:44:45.584551 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c5fb49f_7118_4132_8ffb_5dfed76e4b1f.slice/crio-514fc270b8194e46631ba968bb68a85cda8c5cede81cf34f2f8be8d628d4b0fb WatchSource:0}: Error finding container 514fc270b8194e46631ba968bb68a85cda8c5cede81cf34f2f8be8d628d4b0fb: Status 404 returned error can't find the container with id 514fc270b8194e46631ba968bb68a85cda8c5cede81cf34f2f8be8d628d4b0fb
Apr 17 14:44:45.694609 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:45.694574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4" event={"ID":"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f","Type":"ContainerStarted","Data":"514fc270b8194e46631ba968bb68a85cda8c5cede81cf34f2f8be8d628d4b0fb"}
Apr 17 14:44:47.826190 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:47.826137 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 14:44:47.826442 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:47.826227 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 14:44:47.826442 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:47.826256 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 14:44:48.706367 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:48.706332 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4" event={"ID":"5c5fb49f-7118-4132-8ffb-5dfed76e4b1f","Type":"ContainerStarted","Data":"e750d7402058fb49035f0fc0651fb2f593bdad42d7047a0677c6dad159869b79"}
Apr 17 14:44:48.724717 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:48.724668 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4" podStartSLOduration=1.485618276 podStartE2EDuration="3.724652729s" podCreationTimestamp="2026-04-17 14:44:45 +0000 UTC" firstStartedPulling="2026-04-17 14:44:45.586835156 +0000 UTC m=+654.324063136" lastFinishedPulling="2026-04-17 14:44:47.825869593 +0000 UTC m=+656.563097589" observedRunningTime="2026-04-17 14:44:48.723723363 +0000 UTC m=+657.460951375" watchObservedRunningTime="2026-04-17 14:44:48.724652729 +0000 UTC m=+657.461880732"
Apr 17 14:44:49.445605 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.445560 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"]
Apr 17 14:44:49.447806 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.447789 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq"
Apr 17 14:44:49.449302 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.449279 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:49.450258 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.450230 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-f6hxm\""
Apr 17 14:44:49.450258 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.450233 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 14:44:49.454556 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.453948 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4"
Apr 17 14:44:49.458007 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.457985 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"]
Apr 17 14:44:49.499749 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.499716 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"]
Apr 17 14:44:49.525303 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.525273 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/758cdcf0-6623-484f-b692-b5cbcd622171-config-file\") pod \"limitador-limitador-78c99df468-r2zmq\" (UID: \"758cdcf0-6623-484f-b692-b5cbcd622171\") " pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq"
Apr 17 14:44:49.525424 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.525350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tps2b\" (UniqueName: \"kubernetes.io/projected/758cdcf0-6623-484f-b692-b5cbcd622171-kube-api-access-tps2b\") pod \"limitador-limitador-78c99df468-r2zmq\" (UID: \"758cdcf0-6623-484f-b692-b5cbcd622171\") " pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq"
Apr 17 14:44:49.626514 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.626479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tps2b\" (UniqueName: \"kubernetes.io/projected/758cdcf0-6623-484f-b692-b5cbcd622171-kube-api-access-tps2b\") pod \"limitador-limitador-78c99df468-r2zmq\" (UID: \"758cdcf0-6623-484f-b692-b5cbcd622171\") " pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq"
Apr 17 14:44:49.626671 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.626547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/758cdcf0-6623-484f-b692-b5cbcd622171-config-file\") pod \"limitador-limitador-78c99df468-r2zmq\" (UID: \"758cdcf0-6623-484f-b692-b5cbcd622171\") " pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq"
Apr 17 14:44:49.627204 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.627181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/758cdcf0-6623-484f-b692-b5cbcd622171-config-file\") pod \"limitador-limitador-78c99df468-r2zmq\" (UID: \"758cdcf0-6623-484f-b692-b5cbcd622171\") " pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq"
Apr 17 14:44:49.633912 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.633885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tps2b\" (UniqueName: \"kubernetes.io/projected/758cdcf0-6623-484f-b692-b5cbcd622171-kube-api-access-tps2b\") pod \"limitador-limitador-78c99df468-r2zmq\" (UID: \"758cdcf0-6623-484f-b692-b5cbcd622171\") "
pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq" Apr 17 14:44:49.710354 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.710279 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4" Apr 17 14:44:49.711321 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.711300 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-6zjm4" Apr 17 14:44:49.762916 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.762883 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq" Apr 17 14:44:49.885359 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:49.885263 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"] Apr 17 14:44:49.888083 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:44:49.888052 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod758cdcf0_6623_484f_b692_b5cbcd622171.slice/crio-a0a8db77854f3f4e617e374d58daded39473e246adbc058eb92371514e2af103 WatchSource:0}: Error finding container a0a8db77854f3f4e617e374d58daded39473e246adbc058eb92371514e2af103: Status 404 returned error can't find the container with id a0a8db77854f3f4e617e374d58daded39473e246adbc058eb92371514e2af103 Apr 17 14:44:50.727600 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:50.727564 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq" event={"ID":"758cdcf0-6623-484f-b692-b5cbcd622171","Type":"ContainerStarted","Data":"a0a8db77854f3f4e617e374d58daded39473e246adbc058eb92371514e2af103"} Apr 17 14:44:52.738856 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:52.738775 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq" event={"ID":"758cdcf0-6623-484f-b692-b5cbcd622171","Type":"ContainerStarted","Data":"4547f7c61639191b926801c531e0795bd8e9ba79b3bfb9b73949e11339d224fd"} Apr 17 14:44:52.739255 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:44:52.738946 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq" Apr 17 14:45:03.743518 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:45:03.743487 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq" Apr 17 14:45:03.759575 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:45:03.759525 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-r2zmq" podStartSLOduration=12.204345101 podStartE2EDuration="14.75951212s" podCreationTimestamp="2026-04-17 14:44:49 +0000 UTC" firstStartedPulling="2026-04-17 14:44:49.8899615 +0000 UTC m=+658.627189479" lastFinishedPulling="2026-04-17 14:44:52.445128515 +0000 UTC m=+661.182356498" observedRunningTime="2026-04-17 14:44:52.770002966 +0000 UTC m=+661.507230971" watchObservedRunningTime="2026-04-17 14:45:03.75951212 +0000 UTC m=+672.496740121" Apr 17 14:45:19.276882 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:45:19.276846 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"] Apr 17 14:46:00.141558 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:46:00.141479 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"] Apr 17 14:46:04.123074 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:46:04.123038 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"] Apr 17 14:46:07.110947 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:46:07.110911 2577 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"] Apr 17 14:46:25.916269 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:46:25.916237 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"] Apr 17 14:46:35.614033 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:46:35.613992 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"] Apr 17 14:46:57.714080 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:46:57.714037 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"] Apr 17 14:47:23.111502 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:47:23.111419 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r2zmq"] Apr 17 14:50:54.145481 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:50:54.145388 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-mgflz_b2183913-a604-4568-9192-4bf2e28b1c08/manager/0.log" Apr 17 14:50:54.589602 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:50:54.589556 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c585549bc-bvxnf_2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e/manager/0.log" Apr 17 14:50:56.325868 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:50:56.325843 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-dndvm_1241cfaa-0e6c-4632-a9aa-606a1bd354c9/manager/0.log" Apr 17 14:50:56.884135 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:50:56.884101 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-r2zmq_758cdcf0-6623-484f-b692-b5cbcd622171/limitador/0.log" Apr 17 14:50:57.425205 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:50:57.425169 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rvcl2_0cd0bba8-6306-4ba8-bb06-88442e230f9d/discovery/0.log" Apr 17 14:50:57.637270 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:50:57.637236 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-596bc867c-8h8gh_b87dcf59-e699-4e0b-8923-2a544ff7daee/kube-auth-proxy/0.log" Apr 17 14:50:57.747185 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:50:57.747094 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-6zjm4_5c5fb49f-7118-4132-8ffb-5dfed76e4b1f/istio-proxy/0.log" Apr 17 14:50:57.853215 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:50:57.853183 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6d757c5ffb-lfbbq_f6c83797-bcd2-4b74-a591-ecde3243182e/router/0.log" Apr 17 14:51:05.344864 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:05.344831 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-r2ggk_8cd56fcc-a291-4dce-ab6e-46a29c54ed43/global-pull-secret-syncer/0.log" Apr 17 14:51:05.495413 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:05.495376 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7vwcw_a095d6e7-582d-469a-8e0f-5f14ef60b4bf/konnectivity-agent/0.log" Apr 17 14:51:05.559710 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:05.559679 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-180.ec2.internal_6caffc9b9af5358bace19aa9382d0587/haproxy/0.log" Apr 17 14:51:09.516483 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:09.516449 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-dndvm_1241cfaa-0e6c-4632-a9aa-606a1bd354c9/manager/0.log" Apr 17 14:51:09.663862 
ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:09.663835 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-r2zmq_758cdcf0-6623-484f-b692-b5cbcd622171/limitador/0.log" Apr 17 14:51:11.255803 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.255775 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-b7fd7b587-d4d47_24fe5dfb-a1f2-45df-8eb5-c03383ef1f51/metrics-server/0.log" Apr 17 14:51:11.282497 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.282470 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-rf2lm_c7a7eeed-14fd-46f7-aa37-49304a1e871a/monitoring-plugin/0.log" Apr 17 14:51:11.319397 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.319325 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9nqpt_044d54bf-e77d-422e-b7ca-a2f4a003b98a/node-exporter/0.log" Apr 17 14:51:11.346480 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.346426 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9nqpt_044d54bf-e77d-422e-b7ca-a2f4a003b98a/kube-rbac-proxy/0.log" Apr 17 14:51:11.369573 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.369541 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9nqpt_044d54bf-e77d-422e-b7ca-a2f4a003b98a/init-textfile/0.log" Apr 17 14:51:11.620959 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.620909 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8879eb04-ef63-43a9-8d31-74a573a4ede8/prometheus/0.log" Apr 17 14:51:11.643769 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.643742 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8879eb04-ef63-43a9-8d31-74a573a4ede8/config-reloader/0.log" Apr 17 14:51:11.665309 
ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.665282 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8879eb04-ef63-43a9-8d31-74a573a4ede8/thanos-sidecar/0.log" Apr 17 14:51:11.693310 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.693262 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8879eb04-ef63-43a9-8d31-74a573a4ede8/kube-rbac-proxy-web/0.log" Apr 17 14:51:11.714497 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.714453 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8879eb04-ef63-43a9-8d31-74a573a4ede8/kube-rbac-proxy/0.log" Apr 17 14:51:11.735148 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.735120 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8879eb04-ef63-43a9-8d31-74a573a4ede8/kube-rbac-proxy-thanos/0.log" Apr 17 14:51:11.756836 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:11.756796 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8879eb04-ef63-43a9-8d31-74a573a4ede8/init-config-reloader/0.log" Apr 17 14:51:13.989341 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:13.989305 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b"] Apr 17 14:51:13.991777 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:13.991759 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:13.995125 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:13.995102 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fvwkz\"/\"openshift-service-ca.crt\"" Apr 17 14:51:13.995125 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:13.995102 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fvwkz\"/\"kube-root-ca.crt\"" Apr 17 14:51:13.995860 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:13.995844 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-fvwkz\"/\"default-dockercfg-p84cm\"" Apr 17 14:51:14.001819 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.001799 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b"] Apr 17 14:51:14.055475 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.055436 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-podres\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.055649 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.055479 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-proc\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.055649 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.055537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-sys\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.055771 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.055647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-lib-modules\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.055771 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.055672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwjpg\" (UniqueName: \"kubernetes.io/projected/7bd7582b-2f52-4ad0-8638-7a0db24814a3-kube-api-access-bwjpg\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.156936 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.156896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-lib-modules\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.156936 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.156941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwjpg\" (UniqueName: \"kubernetes.io/projected/7bd7582b-2f52-4ad0-8638-7a0db24814a3-kube-api-access-bwjpg\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " 
pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.157196 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.157001 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-podres\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.157196 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.157059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-proc\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.157196 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.157088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-sys\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.157196 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.157094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-lib-modules\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.157196 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.157170 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-sys\") pod 
\"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.157400 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.157193 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-proc\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.157400 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.157211 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7bd7582b-2f52-4ad0-8638-7a0db24814a3-podres\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.165209 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.165186 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwjpg\" (UniqueName: \"kubernetes.io/projected/7bd7582b-2f52-4ad0-8638-7a0db24814a3-kube-api-access-bwjpg\") pod \"perf-node-gather-daemonset-dkx4b\" (UID: \"7bd7582b-2f52-4ad0-8638-7a0db24814a3\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.302017 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.301983 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:14.419020 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.418996 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b"] Apr 17 14:51:14.421124 ip-10-0-135-180 kubenswrapper[2577]: W0417 14:51:14.421094 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7bd7582b_2f52_4ad0_8638_7a0db24814a3.slice/crio-a026addc58fa30429445306f76d711beae81a38c6ccbbe608cdf10814f80ea33 WatchSource:0}: Error finding container a026addc58fa30429445306f76d711beae81a38c6ccbbe608cdf10814f80ea33: Status 404 returned error can't find the container with id a026addc58fa30429445306f76d711beae81a38c6ccbbe608cdf10814f80ea33 Apr 17 14:51:14.422693 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:14.422675 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:51:15.007751 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:15.007710 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" event={"ID":"7bd7582b-2f52-4ad0-8638-7a0db24814a3","Type":"ContainerStarted","Data":"009d99901e59354f5eb5dd1fa2f8c3564587e3605584b60de37ad4dbf74d9277"} Apr 17 14:51:15.007751 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:15.007756 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" event={"ID":"7bd7582b-2f52-4ad0-8638-7a0db24814a3","Type":"ContainerStarted","Data":"a026addc58fa30429445306f76d711beae81a38c6ccbbe608cdf10814f80ea33"} Apr 17 14:51:15.008274 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:15.007784 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:15.024869 ip-10-0-135-180 
kubenswrapper[2577]: I0417 14:51:15.024818 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" podStartSLOduration=2.024804262 podStartE2EDuration="2.024804262s" podCreationTimestamp="2026-04-17 14:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:51:15.023043239 +0000 UTC m=+1043.760271240" watchObservedRunningTime="2026-04-17 14:51:15.024804262 +0000 UTC m=+1043.762032262" Apr 17 14:51:15.567816 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:15.567786 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t99cr_fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0/dns/0.log" Apr 17 14:51:15.587239 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:15.587213 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t99cr_fce4b3ea-fa1f-49ea-9006-8df35ffaaaf0/kube-rbac-proxy/0.log" Apr 17 14:51:15.695574 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:15.695548 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v95rv_d058fa6d-b2cb-4092-b4f9-49dee965fff3/dns-node-resolver/0.log" Apr 17 14:51:16.129673 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:16.129635 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7dff97d6d-4f999_6a63267e-0738-4e6d-bbe4-4cdb4beb3002/registry/0.log" Apr 17 14:51:16.198332 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:16.198305 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tf94b_b72db081-842b-4ea0-a24a-f1f7916d7ee5/node-ca/0.log" Apr 17 14:51:17.139703 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:17.139670 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rvcl2_0cd0bba8-6306-4ba8-bb06-88442e230f9d/discovery/0.log" Apr 17 14:51:17.182546 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:17.182515 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-596bc867c-8h8gh_b87dcf59-e699-4e0b-8923-2a544ff7daee/kube-auth-proxy/0.log" Apr 17 14:51:17.210825 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:17.210792 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-6zjm4_5c5fb49f-7118-4132-8ffb-5dfed76e4b1f/istio-proxy/0.log" Apr 17 14:51:17.233976 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:17.233949 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6d757c5ffb-lfbbq_f6c83797-bcd2-4b74-a591-ecde3243182e/router/0.log" Apr 17 14:51:17.755071 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:17.755043 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r25f2_df3539d9-3c50-4eb7-8ff9-72a06eb7b7ed/serve-healthcheck-canary/0.log" Apr 17 14:51:18.189976 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:18.189946 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5bjx6_72340448-e6b1-4975-8a57-ec81d546fee4/kube-rbac-proxy/0.log" Apr 17 14:51:18.212730 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:18.212705 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5bjx6_72340448-e6b1-4975-8a57-ec81d546fee4/exporter/0.log" Apr 17 14:51:18.234953 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:18.234927 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5bjx6_72340448-e6b1-4975-8a57-ec81d546fee4/extractor/0.log" Apr 17 14:51:20.217240 ip-10-0-135-180 kubenswrapper[2577]: I0417 
14:51:20.217205 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-mgflz_b2183913-a604-4568-9192-4bf2e28b1c08/manager/0.log" Apr 17 14:51:20.315740 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:20.315711 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c585549bc-bvxnf_2a73dbbe-9bfd-433b-b2b3-3ef1439e3b4e/manager/0.log" Apr 17 14:51:21.021238 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:21.021212 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-dkx4b" Apr 17 14:51:27.167547 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:27.167514 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49c8v_47a7cca6-6a79-404f-9826-84de279713bb/kube-multus-additional-cni-plugins/0.log" Apr 17 14:51:27.190560 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:27.190523 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49c8v_47a7cca6-6a79-404f-9826-84de279713bb/egress-router-binary-copy/0.log" Apr 17 14:51:27.211216 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:27.211185 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49c8v_47a7cca6-6a79-404f-9826-84de279713bb/cni-plugins/0.log" Apr 17 14:51:27.231096 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:27.231073 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49c8v_47a7cca6-6a79-404f-9826-84de279713bb/bond-cni-plugin/0.log" Apr 17 14:51:27.251697 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:27.251671 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49c8v_47a7cca6-6a79-404f-9826-84de279713bb/routeoverride-cni/0.log" 
Apr 17 14:51:27.273005 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:27.272980 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49c8v_47a7cca6-6a79-404f-9826-84de279713bb/whereabouts-cni-bincopy/0.log" Apr 17 14:51:27.292423 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:27.292395 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49c8v_47a7cca6-6a79-404f-9826-84de279713bb/whereabouts-cni/0.log" Apr 17 14:51:27.668707 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:27.668677 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sr7t4_1625a2e5-b33d-4e5d-85f4-286dd067f854/kube-multus/0.log" Apr 17 14:51:27.731013 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:27.730990 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4nvfl_a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87/network-metrics-daemon/0.log" Apr 17 14:51:27.750350 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:27.750325 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4nvfl_a8b3b9c0-3f2e-4dbe-b825-c2fe73140a87/kube-rbac-proxy/0.log" Apr 17 14:51:29.092446 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:29.092411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbxb9_96429a18-321a-48f1-8136-1ce5473e98ee/ovn-controller/0.log" Apr 17 14:51:29.117169 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:29.117127 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbxb9_96429a18-321a-48f1-8136-1ce5473e98ee/ovn-acl-logging/0.log" Apr 17 14:51:29.139041 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:29.139013 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbxb9_96429a18-321a-48f1-8136-1ce5473e98ee/kube-rbac-proxy-node/0.log" Apr 17 14:51:29.162362 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:29.162332 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbxb9_96429a18-321a-48f1-8136-1ce5473e98ee/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 14:51:29.179697 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:29.179666 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbxb9_96429a18-321a-48f1-8136-1ce5473e98ee/northd/0.log" Apr 17 14:51:29.199465 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:29.199438 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbxb9_96429a18-321a-48f1-8136-1ce5473e98ee/nbdb/0.log" Apr 17 14:51:29.221335 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:29.221307 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbxb9_96429a18-321a-48f1-8136-1ce5473e98ee/sbdb/0.log" Apr 17 14:51:29.318606 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:29.318575 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbxb9_96429a18-321a-48f1-8136-1ce5473e98ee/ovnkube-controller/0.log" Apr 17 14:51:30.444870 ip-10-0-135-180 kubenswrapper[2577]: I0417 14:51:30.444836 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-v8n74_5b856db3-62df-499f-acb9-ca566ed0be77/network-check-target-container/0.log"