Apr 17 20:47:45.436928 ip-10-0-130-66 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 20:47:45.436933 ip-10-0-130-66 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 20:47:45.436940 ip-10-0-130-66 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 20:47:45.437281 ip-10-0-130-66 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 20:47:55.502055 ip-10-0-130-66 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 20:47:55.502077 ip-10-0-130-66 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e29b52e6ab8a41f69591d7318496a56c --
Apr 17 20:50:15.659691 ip-10-0-130-66 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:50:16.051012 ip-10-0-130-66 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:50:16.051012 ip-10-0-130-66 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:50:16.051012 ip-10-0-130-66 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:50:16.051012 ip-10-0-130-66 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:50:16.051012 ip-10-0-130-66 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:50:16.053124 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.053026 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:50:16.055298 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055275 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:16.055298 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055293 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:16.055298 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055298 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055306 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055313 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055317 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055321 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055325 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055328 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055332 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055336 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055340 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055345 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055349 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055376 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055380 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055384 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055388 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055392 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055396 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055400 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055404 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:50:16.055522 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055408 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055412 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055416 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055420 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055423 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055427 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055431 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055435 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055439 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055443 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055447 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055451 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055455 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055459 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055464 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055469 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055474 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055478 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055482 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055486 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:16.056399 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055490 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055499 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055505 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055509 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055513 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055517 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055521 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055525 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055529 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055533 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055538 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055542 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055546 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055550 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055554 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055558 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055562 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055566 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055571 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:50:16.057091 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055575 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055579 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055583 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055587 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055592 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055597 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055601 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055606 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055611 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055615 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055619 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055624 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055627 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055632 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055635 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055640 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055644 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055650 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055654 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055658 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:50:16.057593 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055662 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055666 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055670 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055675 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.055679 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056314 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056325 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056329 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056333 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056337 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056341 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056346 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056351 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056375 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056380 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056384 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056389 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056395 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056400 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056404 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:50:16.058186 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056408 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056412 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056417 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056423 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056441 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056446 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056451 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056455 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056461 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056465 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056469 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056474 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056478 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056482 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056486 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056493 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056499 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056504 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056509 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056514 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:16.059057 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056518 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056522 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056527 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056533 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056537 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056542 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056547 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056551 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056556 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056560 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056566 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056570 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056574 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056578 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056582 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056586 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056590 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056594 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056598 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056602 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:50:16.059780 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056606 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056610 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056615 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056620 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056624 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056628 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056631 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056636 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056640 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056643 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056648 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056652 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056656 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056660 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056664 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056668 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056672 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056676 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056681 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056685 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:16.060428 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056689 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056693 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056697 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056702 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056708 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056712 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056716 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056720 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056723 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056727 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.056732 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.057956 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.057974 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.057986 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.057994 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058001 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058006 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058012 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058020 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058025 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 20:50:16.061114 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058030 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058037 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058042 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058047 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058051 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058057 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058062 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058066 2576 flags.go:64] FLAG: --cloud-config=""
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058071 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058076 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058083 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058088 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058093 2576 flags.go:64] FLAG: --config-dir=""
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058098 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058103 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058110 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058123 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058129 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058134 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058139 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058143 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058148 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058154 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058159 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058175 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 20:50:16.061644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058180 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058185 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058189 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058194 2576 flags.go:64] FLAG: --enable-server="true"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058199 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058207 2576 flags.go:64] FLAG: --event-burst="100"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058212 2576 flags.go:64] FLAG: --event-qps="50"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058216 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058222 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058227 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058233 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058238 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058243 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058248 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058252 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058257 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058262 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]:
I0417 20:50:16.058268 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058273 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058277 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058281 2576 flags.go:64] FLAG: --feature-gates="" Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058288 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058293 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058298 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058305 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058310 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 17 20:50:16.062340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058315 2576 flags.go:64] FLAG: --help="false" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058320 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058326 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058330 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058335 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058340 2576 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058346 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058350 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058372 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058376 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058381 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058386 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058391 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058395 2576 flags.go:64] FLAG: --kube-reserved="" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058400 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058404 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058408 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058413 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058418 2576 flags.go:64] FLAG: --lock-file="" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058423 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" 
Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058427 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058432 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058441 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 20:50:16.063017 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058446 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058451 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058456 2576 flags.go:64] FLAG: --logging-format="text" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058461 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058467 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058471 2576 flags.go:64] FLAG: --manifest-url="" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058476 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058483 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058488 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058495 2576 flags.go:64] FLAG: --max-pods="110" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058500 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058505 2576 flags.go:64] 
FLAG: --maximum-dead-containers-per-container="1" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058509 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058514 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058519 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058523 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058528 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058541 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058545 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058551 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058557 2576 flags.go:64] FLAG: --pod-cidr="" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058561 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058570 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058575 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 20:50:16.063597 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058580 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 
20:50:16.058585 2576 flags.go:64] FLAG: --port="10250" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058590 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058594 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04831a03e5a993d40" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058599 2576 flags.go:64] FLAG: --qos-reserved="" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058604 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058616 2576 flags.go:64] FLAG: --register-node="true" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058621 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058626 2576 flags.go:64] FLAG: --register-with-taints="" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058633 2576 flags.go:64] FLAG: --registry-burst="10" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058638 2576 flags.go:64] FLAG: --registry-qps="5" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058643 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058647 2576 flags.go:64] FLAG: --reserved-memory="" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058653 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058658 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058663 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058668 2576 
flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058674 2576 flags.go:64] FLAG: --runonce="false" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058678 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058683 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058688 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058693 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058697 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058702 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058707 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058712 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 20:50:16.064172 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058717 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058722 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058726 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058731 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058736 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 
20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058740 2576 flags.go:64] FLAG: --system-cgroups="" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058745 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058754 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058759 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058764 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058770 2576 flags.go:64] FLAG: --tls-min-version="" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058774 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058778 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058783 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058788 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058793 2576 flags.go:64] FLAG: --v="2" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058800 2576 flags.go:64] FLAG: --version="false" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058806 2576 flags.go:64] FLAG: --vmodule="" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058813 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.058818 2576 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.058986 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.058993 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.058998 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059003 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:50:16.064821 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059015 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059021 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059026 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059030 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059035 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059039 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059043 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059047 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 
20:50:16.059051 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059055 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059059 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059063 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059067 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059071 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059074 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059078 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059082 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059110 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059117 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:50:16.065500 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059128 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059134 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059138 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059143 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059148 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059152 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059156 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059160 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059167 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059172 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059176 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059180 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059184 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059188 2576 feature_gate.go:328] unrecognized 
feature gate: InsightsOnDemandDataGather Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059193 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059197 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059201 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059205 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059209 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059214 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:50:16.066014 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059218 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059222 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059226 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059230 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059234 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059240 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059246 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059250 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059254 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059258 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059262 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059266 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059272 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059276 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059280 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059284 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059288 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059292 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059296 2576 feature_gate.go:328] unrecognized 
feature gate: NoRegistryClusterOperations Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059300 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:50:16.066530 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059306 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059310 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059314 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059318 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059322 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059326 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059330 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059334 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059338 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059343 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059347 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059370 2576 feature_gate.go:328] unrecognized 
feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059375 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059380 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059384 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059388 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059392 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059396 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059401 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059404 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:50:16.067023 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059408 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059412 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.059416 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.059862 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true 
KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.067072 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.067088 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067147 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067152 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067156 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067159 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067162 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067164 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067167 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067169 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 
20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067172 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:50:16.067533 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067174 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067177 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067179 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067182 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067186 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067189 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067192 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067195 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067197 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067200 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067202 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067207 2576 feature_gate.go:351] 
Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067211 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067214 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067217 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067220 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067223 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067225 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067228 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:50:16.067900 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067230 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067233 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067236 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067240 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067244 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067248 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067250 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067253 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067256 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067259 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067261 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067264 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067266 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067269 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067271 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067274 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067277 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067280 2576 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067283 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067286 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:50:16.068380 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067288 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067291 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067293 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067296 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067298 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067301 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067303 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067306 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067308 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067311 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 
20:50:16.067313 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067315 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067318 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067321 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067323 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067326 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067329 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067332 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067334 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067337 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:50:16.068951 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067339 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067342 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067344 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:50:16.069516 ip-10-0-130-66 
kubenswrapper[2576]: W0417 20:50:16.067347 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067349 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067352 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067374 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067377 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067380 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067382 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067385 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067389 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067392 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067394 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067397 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067399 2576 feature_gate.go:328] unrecognized 
feature gate: NoRegistryClusterOperations Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067402 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:50:16.069516 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067404 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.067410 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067525 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067530 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067534 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067537 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067540 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067543 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067546 2576 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067549 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067552 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067555 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067558 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067562 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067565 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067568 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:50:16.069942 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067570 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067573 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067575 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067578 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067580 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 
20:50:16.067583 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067586 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067588 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067591 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067593 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067596 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067606 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067608 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067611 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067613 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067616 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067618 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067620 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:50:16.070351 
ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067623 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:50:16.070351 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067627 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067631 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067634 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067637 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067639 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067642 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067645 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067648 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067651 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067653 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067656 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067659 2576 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067663 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067666 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067668 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067671 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067673 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067676 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067679 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:50:16.070849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067681 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067684 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067686 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067688 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067691 2576 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067693 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067701 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067704 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067706 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067709 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067711 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067714 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067716 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067719 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067722 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067724 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067727 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:50:16.071303 ip-10-0-130-66 
kubenswrapper[2576]: W0417 20:50:16.067730 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067733 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067735 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:50:16.071303 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067738 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067740 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067743 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067745 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067748 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067750 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067752 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067755 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067757 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067760 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 
20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067762 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067765 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067767 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:16.067770 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.067774 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:50:16.071809 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.068317 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 20:50:16.072179 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.070241 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 20:50:16.072179 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.071068 2576 server.go:1019] "Starting client certificate rotation" Apr 17 20:50:16.072179 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.071165 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 20:50:16.072179 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.071203 2576 
certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 20:50:16.096697 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.096672 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 20:50:16.101347 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.101323 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 20:50:16.114124 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.114100 2576 log.go:25] "Validated CRI v1 runtime API" Apr 17 20:50:16.118864 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.118847 2576 log.go:25] "Validated CRI v1 image API" Apr 17 20:50:16.120620 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.120595 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 20:50:16.122598 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.122576 2576 fs.go:135] Filesystem UUIDs: map[29ad09bd-67bc-4888-a10f-b6a32364ca6f:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 7d0e2439-ceeb-4558-9a27-ec2fa72c8167:/dev/nvme0n1p4] Apr 17 20:50:16.122698 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.122596 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 20:50:16.125958 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.125941 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" 
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:50:16.127785 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.127667 2576 manager.go:217] Machine: {Timestamp:2026-04-17 20:50:16.126652814 +0000 UTC m=+0.361119212 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101118 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ea168d57b0a305bf1e846f4f1c2a1 SystemUUID:ec2ea168-d57b-0a30-5bf1-e846f4f1c2a1 BootID:e29b52e6-ab8a-41f6-9591-d7318496a56c Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:54:c4:32:79:71 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:54:c4:32:79:71 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:5f:dd:fd:b1:23 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 
Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 20:50:16.128259 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.128249 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 17 20:50:16.128399 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.128386 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 20:50:16.130579 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.130400 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 20:50:16.130794 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.130584 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-130-66.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 20:50:16.130845 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.130805 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 20:50:16.130845 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.130815 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 20:50:16.130845 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.130828 2576 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 20:50:16.132048 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.132037 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 20:50:16.132693 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.132684 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:50:16.132810 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.132800 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 20:50:16.134985 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.134975 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 17 20:50:16.135019 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.134989 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 20:50:16.135019 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.135005 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 20:50:16.135019 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.135013 2576 kubelet.go:397] "Adding apiserver pod source" Apr 17 20:50:16.135116 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.135021 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 20:50:16.135991 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.135979 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:50:16.136045 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.135998 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:50:16.138555 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.138538 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 20:50:16.139794 ip-10-0-130-66 
kubenswrapper[2576]: I0417 20:50:16.139781 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 20:50:16.141338 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141326 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 20:50:16.141408 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141344 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 20:50:16.141408 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141350 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 20:50:16.141408 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141373 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 20:50:16.141408 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141382 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 20:50:16.141408 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141388 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 20:50:16.141408 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141407 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 20:50:16.141571 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141413 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 20:50:16.141571 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141420 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 20:50:16.141571 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141426 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 20:50:16.141571 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141434 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 20:50:16.141571 
ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.141442 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 20:50:16.142331 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.142322 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 20:50:16.142378 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.142332 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 20:50:16.146117 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.146099 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 20:50:16.146216 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.146142 2576 server.go:1295] "Started kubelet" Apr 17 20:50:16.146266 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.146238 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 20:50:16.146311 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.146297 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-66.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 20:50:16.146394 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.146342 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 20:50:16.146435 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.146407 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 20:50:16.146544 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.146515 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-66.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 20:50:16.146544 ip-10-0-130-66 
kubenswrapper[2576]: E0417 20:50:16.146516 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 20:50:16.147099 ip-10-0-130-66 systemd[1]: Started Kubernetes Kubelet. Apr 17 20:50:16.148581 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.148566 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 17 20:50:16.150282 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.150267 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 20:50:16.154162 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.154137 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 20:50:16.154511 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.154497 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 20:50:16.155187 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.155173 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 20:50:16.155295 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.155276 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 20:50:16.155411 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.155393 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 20:50:16.155475 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.155455 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 17 20:50:16.155475 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.155463 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 17 20:50:16.155626 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.155609 2576 factory.go:55] Registering systemd factory Apr 17 20:50:16.155899 ip-10-0-130-66 
kubenswrapper[2576]: I0417 20:50:16.155630 2576 factory.go:223] Registration of the systemd container factory successfully Apr 17 20:50:16.155899 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.155895 2576 factory.go:153] Registering CRI-O factory Apr 17 20:50:16.156011 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.155912 2576 factory.go:223] Registration of the crio container factory successfully Apr 17 20:50:16.156011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.155918 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 20:50:16.156011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.154875 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-66.ec2.internal.18a74010276dd496 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-66.ec2.internal,UID:ip-10-0-130-66.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-66.ec2.internal,},FirstTimestamp:2026-04-17 20:50:16.146113686 +0000 UTC m=+0.380580086,LastTimestamp:2026-04-17 20:50:16.146113686 +0000 UTC m=+0.380580086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-66.ec2.internal,}" Apr 17 20:50:16.156011 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.155979 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 
20:50:16.156011 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.156006 2576 factory.go:103] Registering Raw factory Apr 17 20:50:16.156251 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.156023 2576 manager.go:1196] Started watching for new ooms in manager Apr 17 20:50:16.156334 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.155619 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:16.156509 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.156495 2576 manager.go:319] Starting recovery of all containers Apr 17 20:50:16.159213 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.159184 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-66.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 20:50:16.159579 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.159464 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 20:50:16.165531 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.165506 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hzgzh" Apr 17 20:50:16.169248 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.169228 2576 manager.go:324] Recovery completed Apr 17 20:50:16.170309 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.170290 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hzgzh" Apr 17 20:50:16.173934 ip-10-0-130-66 
kubenswrapper[2576]: I0417 20:50:16.173921 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:16.176539 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.176523 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:16.176606 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.176551 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:16.176606 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.176562 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:16.176967 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.176956 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 20:50:16.176967 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.176966 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 20:50:16.177045 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.176980 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:50:16.178245 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.178183 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-66.ec2.internal.18a74010293e1236 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-66.ec2.internal,UID:ip-10-0-130-66.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-66.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-66.ec2.internal,},FirstTimestamp:2026-04-17 
20:50:16.176538166 +0000 UTC m=+0.411004563,LastTimestamp:2026-04-17 20:50:16.176538166 +0000 UTC m=+0.411004563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-66.ec2.internal,}" Apr 17 20:50:16.180055 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.180042 2576 policy_none.go:49] "None policy: Start" Apr 17 20:50:16.180106 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.180059 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 20:50:16.180106 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.180069 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 17 20:50:16.221980 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.221965 2576 manager.go:341] "Starting Device Plugin manager" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.222005 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.222015 2576 server.go:85] "Starting device plugin registration server" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.222233 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.222243 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.222390 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.222486 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.222498 2576 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.222920 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.222957 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.233562 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.234895 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.234922 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.234944 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.234952 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.235027 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 20:50:16.249807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.236940 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:16.322667 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.322576 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:16.323511 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.323493 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:16.323608 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.323525 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:16.323608 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.323540 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:16.323608 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.323568 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.329762 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.329744 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.329866 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.329771 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-66.ec2.internal\": node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:16.335384 
ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.335337 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal"] Apr 17 20:50:16.335467 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.335415 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:16.337023 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.337006 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:16.337121 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.337037 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:16.337121 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.337050 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:16.339378 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.339346 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:16.339500 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.339483 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.339546 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.339517 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:16.340112 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.340091 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:16.340209 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.340125 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:16.340209 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.340132 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:16.340209 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.340138 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:16.340209 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.340156 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:16.340209 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.340170 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:16.342486 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.342470 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.342579 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.342497 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:16.343382 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.343350 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:16.343466 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.343399 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:16.343466 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.343413 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:16.349042 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.349024 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:16.355901 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.355878 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-66.ec2.internal\" not found" node="ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.356505 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.356488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3b8513526e5cd2b3c1dca895a37fc635-config\") pod \"kube-apiserver-proxy-ip-10-0-130-66.ec2.internal\" (UID: \"3b8513526e5cd2b3c1dca895a37fc635\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.356572 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.356514 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c59ba66ee4d7ecaead53b509d5e7b5e5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal\" (UID: \"c59ba66ee4d7ecaead53b509d5e7b5e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.356572 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.356533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c59ba66ee4d7ecaead53b509d5e7b5e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal\" (UID: \"c59ba66ee4d7ecaead53b509d5e7b5e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.359803 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.359781 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-66.ec2.internal\" not found" node="ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.449553 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.449516 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:16.456854 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.456825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3b8513526e5cd2b3c1dca895a37fc635-config\") pod \"kube-apiserver-proxy-ip-10-0-130-66.ec2.internal\" (UID: \"3b8513526e5cd2b3c1dca895a37fc635\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.456968 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.456861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/c59ba66ee4d7ecaead53b509d5e7b5e5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal\" (UID: \"c59ba66ee4d7ecaead53b509d5e7b5e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.456968 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.456885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c59ba66ee4d7ecaead53b509d5e7b5e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal\" (UID: \"c59ba66ee4d7ecaead53b509d5e7b5e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.456968 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.456920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3b8513526e5cd2b3c1dca895a37fc635-config\") pod \"kube-apiserver-proxy-ip-10-0-130-66.ec2.internal\" (UID: \"3b8513526e5cd2b3c1dca895a37fc635\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.456968 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.456944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c59ba66ee4d7ecaead53b509d5e7b5e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal\" (UID: \"c59ba66ee4d7ecaead53b509d5e7b5e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.457103 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.456964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c59ba66ee4d7ecaead53b509d5e7b5e5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal\" (UID: \"c59ba66ee4d7ecaead53b509d5e7b5e5\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.550641 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.550586 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:16.651462 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.651374 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:16.657704 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.657688 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.662188 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:16.662168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal" Apr 17 20:50:16.751693 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.751645 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:16.852191 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.852144 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:16.952763 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:16.952672 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:17.053203 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:17.053167 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:17.071656 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.071629 2576 transport.go:147] "Certificate rotation detected, shutting down client connections 
to start using new credentials" Apr 17 20:50:17.071794 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.071773 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 20:50:17.153751 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:17.153585 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:17.154685 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.154663 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 20:50:17.170303 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.170280 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:50:17.172074 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.172044 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 20:45:16 +0000 UTC" deadline="2027-11-22 01:02:22.866639036 +0000 UTC" Apr 17 20:50:17.172074 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.172071 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13996h12m5.694571477s" Apr 17 20:50:17.183398 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:17.183348 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b8513526e5cd2b3c1dca895a37fc635.slice/crio-43fd542ead18bcce14e72b227bf9ba0b3dac05149fcc8b27cc65d8508fbed11d WatchSource:0}: Error finding container 
43fd542ead18bcce14e72b227bf9ba0b3dac05149fcc8b27cc65d8508fbed11d: Status 404 returned error can't find the container with id 43fd542ead18bcce14e72b227bf9ba0b3dac05149fcc8b27cc65d8508fbed11d Apr 17 20:50:17.183849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:17.183830 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc59ba66ee4d7ecaead53b509d5e7b5e5.slice/crio-02f6d7207b6c187585d7ec96d39af401bf105d2a7f597cf29a1dde2d37f65cd9 WatchSource:0}: Error finding container 02f6d7207b6c187585d7ec96d39af401bf105d2a7f597cf29a1dde2d37f65cd9: Status 404 returned error can't find the container with id 02f6d7207b6c187585d7ec96d39af401bf105d2a7f597cf29a1dde2d37f65cd9 Apr 17 20:50:17.188203 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.188185 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:50:17.190935 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.190917 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-km4mq" Apr 17 20:50:17.196685 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.196668 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-km4mq" Apr 17 20:50:17.237765 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.237641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal" event={"ID":"3b8513526e5cd2b3c1dca895a37fc635","Type":"ContainerStarted","Data":"43fd542ead18bcce14e72b227bf9ba0b3dac05149fcc8b27cc65d8508fbed11d"} Apr 17 20:50:17.238656 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.238634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" 
event={"ID":"c59ba66ee4d7ecaead53b509d5e7b5e5","Type":"ContainerStarted","Data":"02f6d7207b6c187585d7ec96d39af401bf105d2a7f597cf29a1dde2d37f65cd9"} Apr 17 20:50:17.254811 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:17.254788 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:17.355259 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:17.355227 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-66.ec2.internal\" not found" Apr 17 20:50:17.420516 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.420476 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:17.447134 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.447106 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:17.456006 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.455982 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" Apr 17 20:50:17.467656 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.467634 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 20:50:17.469148 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.469134 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal" Apr 17 20:50:17.472950 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.472936 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:17.477217 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:17.477197 2576 warnings.go:110] "Warning: metadata.name: this is used 
in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 20:50:18.136823 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.136788 2576 apiserver.go:52] "Watching apiserver" Apr 17 20:50:18.144463 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.144436 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 20:50:18.144838 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.144812 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rv6lj","openshift-multus/multus-cj26q","openshift-multus/network-metrics-daemon-cxq8r","openshift-network-diagnostics/network-check-target-rjt2l","kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal","openshift-image-registry/node-ca-f2pm8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal","openshift-network-operator/iptables-alerter-kzmpd","openshift-ovn-kubernetes/ovnkube-node-5vzmr","kube-system/konnectivity-agent-gd4dw","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn","openshift-cluster-node-tuning-operator/tuned-jhhqq","openshift-dns/node-resolver-qcw6q"] Apr 17 20:50:18.147836 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.147817 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gd4dw" Apr 17 20:50:18.150094 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.150070 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.150196 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.150121 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 20:50:18.150196 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.150157 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 20:50:18.150314 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.150299 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bxtgt\"" Apr 17 20:50:18.152384 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.152213 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:18.152384 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.152283 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:18.152384 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.152312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-w29pc\"" Apr 17 20:50:18.152591 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.152461 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 20:50:18.152591 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.152573 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 20:50:18.152689 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.152628 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 20:50:18.152924 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.152908 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 20:50:18.154491 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.154471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:18.154583 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.154533 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:18.157919 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.157900 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-f2pm8" Apr 17 20:50:18.160528 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.160508 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-scxbk\"" Apr 17 20:50:18.160614 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.160566 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 20:50:18.160702 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.160511 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 20:50:18.160755 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.160511 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 20:50:18.163932 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.163916 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kzmpd" Apr 17 20:50:18.164069 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.164049 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.165758 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.165742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7622l\" (UniqueName: \"kubernetes.io/projected/fc1a4043-1274-42fb-ade0-e46458e332ce-kube-api-access-7622l\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.165860 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.165765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf8a6449-19c6-469b-9f9c-049cc3f220b8-serviceca\") pod \"node-ca-f2pm8\" (UID: \"bf8a6449-19c6-469b-9f9c-049cc3f220b8\") " pod="openshift-image-registry/node-ca-f2pm8" Apr 17 20:50:18.165860 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.165782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/10783c15-a601-4d72-90a5-870dc70d9889-konnectivity-ca\") pod \"konnectivity-agent-gd4dw\" (UID: \"10783c15-a601-4d72-90a5-870dc70d9889\") " pod="kube-system/konnectivity-agent-gd4dw" Apr 17 20:50:18.165860 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.165802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-cnibin\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.165860 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.165825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmwp8\" (UniqueName: 
\"kubernetes.io/projected/b1da9568-78d7-4d7f-93b4-33b608a48c41-kube-api-access-mmwp8\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:18.165860 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.165847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dc82\" (UniqueName: \"kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82\") pod \"network-check-target-rjt2l\" (UID: \"9e571f60-0b76-435b-aac4-aada6990b2b3\") " pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:18.166107 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.165876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf8a6449-19c6-469b-9f9c-049cc3f220b8-host\") pod \"node-ca-f2pm8\" (UID: \"bf8a6449-19c6-469b-9f9c-049cc3f220b8\") " pod="openshift-image-registry/node-ca-f2pm8" Apr 17 20:50:18.166107 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.165895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpjm5\" (UniqueName: \"kubernetes.io/projected/bf8a6449-19c6-469b-9f9c-049cc3f220b8-kube-api-access-fpjm5\") pod \"node-ca-f2pm8\" (UID: \"bf8a6449-19c6-469b-9f9c-049cc3f220b8\") " pod="openshift-image-registry/node-ca-f2pm8" Apr 17 20:50:18.166107 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.165912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/10783c15-a601-4d72-90a5-870dc70d9889-agent-certs\") pod \"konnectivity-agent-gd4dw\" (UID: \"10783c15-a601-4d72-90a5-870dc70d9889\") " pod="kube-system/konnectivity-agent-gd4dw" Apr 17 20:50:18.166107 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166021 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-cni-dir\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166107 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-os-release\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166319 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc1a4043-1274-42fb-ade0-e46458e332ce-cni-binary-copy\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166319 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-socket-dir-parent\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166319 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-run-k8s-cni-cncf-io\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166319 ip-10-0-130-66 
kubenswrapper[2576]: I0417 20:50:18.166189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-var-lib-cni-bin\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166319 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166207 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:50:18.166319 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-var-lib-kubelet\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166319 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-var-lib-cni-multus\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166319 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166257 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-hostroot\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166319 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-conf-dir\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166649 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-daemon-config\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166649 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-run-multus-certs\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166649 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-etc-kubernetes\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166649 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166455 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:18.166649 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166505 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-system-cni-dir\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166649 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-run-netns\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.166818 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166676 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 20:50:18.166818 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166688 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2gpzj\"" Apr 17 20:50:18.166818 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.166707 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 20:50:18.167844 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.167804 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 20:50:18.167954 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.167854 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 20:50:18.167954 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.167870 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 
20:50:18.167954 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.167929 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 20:50:18.168101 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.167854 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hkpd2\"" Apr 17 20:50:18.168101 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.168054 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 20:50:18.168193 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.168124 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 20:50:18.168651 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.168623 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.168745 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.168704 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.170807 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.170790 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 20:50:18.170955 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.170936 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ggbzf\"" Apr 17 20:50:18.171059 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.171036 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 20:50:18.171120 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.171069 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 20:50:18.171120 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.171086 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 20:50:18.171219 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.171160 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7qbl7\"" Apr 17 20:50:18.171263 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.171221 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 20:50:18.171347 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.171332 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.173387 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.173350 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:50:18.173864 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.173516 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bv6n4\"" Apr 17 20:50:18.173864 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.173519 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 20:50:18.173864 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.173659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qcw6q" Apr 17 20:50:18.175598 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.175577 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 20:50:18.175789 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.175753 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cf6db\"" Apr 17 20:50:18.175789 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.175774 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 20:50:18.197309 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.197279 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:45:17 +0000 UTC" deadline="2027-09-21 00:25:27.857014199 +0000 UTC" Apr 17 20:50:18.197309 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.197307 2576 certificate_manager.go:431] 
"Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12507h35m9.659710322s" Apr 17 20:50:18.256150 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.256117 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 20:50:18.266740 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.266708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-var-lib-cni-multus\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.266740 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.266753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-sysctl-conf\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.266958 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.266779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-run\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.266958 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.266803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-host\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.266958 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.266822 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-var-lib-cni-multus\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.266958 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.266869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6d46ce2-d6de-472b-86f9-8e1d10a1c269-iptables-alerter-script\") pod \"iptables-alerter-kzmpd\" (UID: \"b6d46ce2-d6de-472b-86f9-8e1d10a1c269\") " pod="openshift-network-operator/iptables-alerter-kzmpd"
Apr 17 20:50:18.266958 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.266899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-system-cni-dir\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.266958 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.266924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf8a6449-19c6-469b-9f9c-049cc3f220b8-serviceca\") pod \"node-ca-f2pm8\" (UID: \"bf8a6449-19c6-469b-9f9c-049cc3f220b8\") " pod="openshift-image-registry/node-ca-f2pm8"
Apr 17 20:50:18.266958 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.266953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20502fc4-4670-4b18-827e-d706c996ef2f-tmp\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.266978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-kubelet\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-system-cni-dir\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-run-netns\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-run-ovn\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1876416b-79dd-4ff1-88a4-b7111c5e304d-cni-binary-copy\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1876416b-79dd-4ff1-88a4-b7111c5e304d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dc82\" (UniqueName: \"kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82\") pod \"network-check-target-rjt2l\" (UID: \"9e571f60-0b76-435b-aac4-aada6990b2b3\") " pod="openshift-network-diagnostics/network-check-target-rjt2l"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-cni-bin\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-ovnkube-script-lib\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.267294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc1a4043-1274-42fb-ade0-e46458e332ce-cni-binary-copy\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-conf-dir\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267407 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf8a6449-19c6-469b-9f9c-049cc3f220b8-serviceca\") pod \"node-ca-f2pm8\" (UID: \"bf8a6449-19c6-469b-9f9c-049cc3f220b8\") " pod="openshift-image-registry/node-ca-f2pm8"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-sysctl-d\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267464 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-conf-dir\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6d46ce2-d6de-472b-86f9-8e1d10a1c269-host-slash\") pod \"iptables-alerter-kzmpd\" (UID: \"b6d46ce2-d6de-472b-86f9-8e1d10a1c269\") " pod="openshift-network-operator/iptables-alerter-kzmpd"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-systemd-units\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-slash\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-cni-netd\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267609 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-registration-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-daemon-config\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267660 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-sys\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-os-release\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj"
Apr 17 20:50:18.267732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7622l\" (UniqueName: \"kubernetes.io/projected/fc1a4043-1274-42fb-ade0-e46458e332ce-kube-api-access-7622l\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-cnibin\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf8a6449-19c6-469b-9f9c-049cc3f220b8-host\") pod \"node-ca-f2pm8\" (UID: \"bf8a6449-19c6-469b-9f9c-049cc3f220b8\") " pod="openshift-image-registry/node-ca-f2pm8"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/10783c15-a601-4d72-90a5-870dc70d9889-agent-certs\") pod \"konnectivity-agent-gd4dw\" (UID: \"10783c15-a601-4d72-90a5-870dc70d9889\") " pod="kube-system/konnectivity-agent-gd4dw"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-cnibin\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-sysconfig\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf8a6449-19c6-469b-9f9c-049cc3f220b8-host\") pod \"node-ca-f2pm8\" (UID: \"bf8a6449-19c6-469b-9f9c-049cc3f220b8\") " pod="openshift-image-registry/node-ca-f2pm8"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-node-log\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc1a4043-1274-42fb-ade0-e46458e332ce-cni-binary-copy\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpjm5\" (UniqueName: \"kubernetes.io/projected/bf8a6449-19c6-469b-9f9c-049cc3f220b8-kube-api-access-fpjm5\") pod \"node-ca-f2pm8\" (UID: \"bf8a6449-19c6-469b-9f9c-049cc3f220b8\") " pod="openshift-image-registry/node-ca-f2pm8"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-var-lib-cni-bin\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.267980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-modprobe-d\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f96c4ba0-6cee-4727-bef7-248a0da4b215-hosts-file\") pod \"node-resolver-qcw6q\" (UID: \"f96c4ba0-6cee-4727-bef7-248a0da4b215\") " pod="openshift-dns/node-resolver-qcw6q"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-etc-openvswitch\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/10783c15-a601-4d72-90a5-870dc70d9889-konnectivity-ca\") pod \"konnectivity-agent-gd4dw\" (UID: \"10783c15-a601-4d72-90a5-870dc70d9889\") " pod="kube-system/konnectivity-agent-gd4dw"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-systemd\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.268194 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268119 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-lib-modules\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-var-lib-cni-bin\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268175 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-run-systemd\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-var-lib-openvswitch\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-daemon-config\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-kubernetes\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.268253 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-var-lib-kubelet\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f96c4ba0-6cee-4727-bef7-248a0da4b215-tmp-dir\") pod \"node-resolver-qcw6q\" (UID: \"f96c4ba0-6cee-4727-bef7-248a0da4b215\") " pod="openshift-dns/node-resolver-qcw6q"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-cnibin\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.268396 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs podName:b1da9568-78d7-4d7f-93b4-33b608a48c41 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:18.768331911 +0000 UTC m=+3.002798329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs") pod "network-metrics-daemon-cxq8r" (UID: "b1da9568-78d7-4d7f-93b4-33b608a48c41") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1876416b-79dd-4ff1-88a4-b7111c5e304d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-device-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-cni-dir\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-os-release\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-os-release\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.268916 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-socket-dir-parent\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-cni-dir\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-hostroot\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-multus-socket-dir-parent\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-etc-kubernetes\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-hostroot\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-run-openvswitch\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-env-overrides\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-etc-kubernetes\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/10783c15-a601-4d72-90a5-870dc70d9889-konnectivity-ca\") pod \"konnectivity-agent-gd4dw\" (UID: \"10783c15-a601-4d72-90a5-870dc70d9889\") " pod="kube-system/konnectivity-agent-gd4dw"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9kxs\" (UniqueName: \"kubernetes.io/projected/1876416b-79dd-4ff1-88a4-b7111c5e304d-kube-api-access-p9kxs\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cpn5\" (UniqueName: \"kubernetes.io/projected/f96c4ba0-6cee-4727-bef7-248a0da4b215-kube-api-access-4cpn5\") pod \"node-resolver-qcw6q\" (UID: \"f96c4ba0-6cee-4727-bef7-248a0da4b215\") " pod="openshift-dns/node-resolver-qcw6q"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8c7s\" (UniqueName: \"kubernetes.io/projected/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-kube-api-access-g8c7s\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-etc-selinux\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-sys-fs\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-run-netns\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.269640 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/20502fc4-4670-4b18-827e-d706c996ef2f-etc-tuned\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-log-socket\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-run-netns\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.268978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-ovnkube-config\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmwp8\" (UniqueName: \"kubernetes.io/projected/b1da9568-78d7-4d7f-93b4-33b608a48c41-kube-api-access-mmwp8\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bhcg\" (UniqueName: \"kubernetes.io/projected/20502fc4-4670-4b18-827e-d706c996ef2f-kube-api-access-2bhcg\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269115 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-ovn-node-metrics-cert\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-socket-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqzk7\" (UniqueName: \"kubernetes.io/projected/07c94259-a674-49df-9a32-3cd6d1482c4e-kube-api-access-rqzk7\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-run-k8s-cni-cncf-io\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-var-lib-kubelet\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-run-multus-certs\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-run-k8s-cni-cncf-io\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmj66\" (UniqueName: \"kubernetes.io/projected/b6d46ce2-d6de-472b-86f9-8e1d10a1c269-kube-api-access-tmj66\") pod \"iptables-alerter-kzmpd\" (UID: \"b6d46ce2-d6de-472b-86f9-8e1d10a1c269\") " pod="openshift-network-operator/iptables-alerter-kzmpd"
Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269330 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName:
\"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-system-cni-dir\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-var-lib-kubelet\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.270252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.269380 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fc1a4043-1274-42fb-ade0-e46458e332ce-host-run-multus-certs\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.272800 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.272780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/10783c15-a601-4d72-90a5-870dc70d9889-agent-certs\") pod \"konnectivity-agent-gd4dw\" (UID: \"10783c15-a601-4d72-90a5-870dc70d9889\") " pod="kube-system/konnectivity-agent-gd4dw" Apr 17 20:50:18.273242 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.273226 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:50:18.273330 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.273248 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:50:18.273330 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.273261 2576 
projected.go:194] Error preparing data for projected volume kube-api-access-9dc82 for pod openshift-network-diagnostics/network-check-target-rjt2l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:18.273465 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.273330 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82 podName:9e571f60-0b76-435b-aac4-aada6990b2b3 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:18.773314095 +0000 UTC m=+3.007780483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9dc82" (UniqueName: "kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82") pod "network-check-target-rjt2l" (UID: "9e571f60-0b76-435b-aac4-aada6990b2b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:18.277441 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.277421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpjm5\" (UniqueName: \"kubernetes.io/projected/bf8a6449-19c6-469b-9f9c-049cc3f220b8-kube-api-access-fpjm5\") pod \"node-ca-f2pm8\" (UID: \"bf8a6449-19c6-469b-9f9c-049cc3f220b8\") " pod="openshift-image-registry/node-ca-f2pm8" Apr 17 20:50:18.277712 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.277657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7622l\" (UniqueName: \"kubernetes.io/projected/fc1a4043-1274-42fb-ade0-e46458e332ce-kube-api-access-7622l\") pod \"multus-cj26q\" (UID: \"fc1a4043-1274-42fb-ade0-e46458e332ce\") " pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.281620 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.281593 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmwp8\" (UniqueName: \"kubernetes.io/projected/b1da9568-78d7-4d7f-93b4-33b608a48c41-kube-api-access-mmwp8\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:18.326615 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.326581 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:18.369923 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.369890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-systemd\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.370093 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.369934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-lib-modules\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.370093 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.369957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-run-systemd\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.370093 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.369981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-var-lib-openvswitch\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.370093 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-systemd\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.370093 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-run-systemd\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.370093 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-kubernetes\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-var-lib-kubelet\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-lib-modules\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f96c4ba0-6cee-4727-bef7-248a0da4b215-tmp-dir\") pod \"node-resolver-qcw6q\" (UID: \"f96c4ba0-6cee-4727-bef7-248a0da4b215\") " pod="openshift-dns/node-resolver-qcw6q" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-kubernetes\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-cnibin\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1876416b-79dd-4ff1-88a4-b7111c5e304d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-var-lib-openvswitch\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-device-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-run-openvswitch\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-env-overrides\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-cnibin\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370259 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p9kxs\" (UniqueName: \"kubernetes.io/projected/1876416b-79dd-4ff1-88a4-b7111c5e304d-kube-api-access-p9kxs\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cpn5\" (UniqueName: \"kubernetes.io/projected/f96c4ba0-6cee-4727-bef7-248a0da4b215-kube-api-access-4cpn5\") pod \"node-resolver-qcw6q\" (UID: \"f96c4ba0-6cee-4727-bef7-248a0da4b215\") " pod="openshift-dns/node-resolver-qcw6q" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8c7s\" (UniqueName: \"kubernetes.io/projected/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-kube-api-access-g8c7s\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.370401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-var-lib-kubelet\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.370401 
ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-etc-selinux\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-sys-fs\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/20502fc4-4670-4b18-827e-d706c996ef2f-etc-tuned\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-log-socket\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-ovnkube-config\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f96c4ba0-6cee-4727-bef7-248a0da4b215-tmp-dir\") pod \"node-resolver-qcw6q\" (UID: \"f96c4ba0-6cee-4727-bef7-248a0da4b215\") " pod="openshift-dns/node-resolver-qcw6q" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bhcg\" (UniqueName: \"kubernetes.io/projected/20502fc4-4670-4b18-827e-d706c996ef2f-kube-api-access-2bhcg\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-ovn-node-metrics-cert\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-device-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-socket-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqzk7\" (UniqueName: \"kubernetes.io/projected/07c94259-a674-49df-9a32-3cd6d1482c4e-kube-api-access-rqzk7\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmj66\" (UniqueName: \"kubernetes.io/projected/b6d46ce2-d6de-472b-86f9-8e1d10a1c269-kube-api-access-tmj66\") pod \"iptables-alerter-kzmpd\" (UID: \"b6d46ce2-d6de-472b-86f9-8e1d10a1c269\") " pod="openshift-network-operator/iptables-alerter-kzmpd" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-system-cni-dir\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-sysctl-conf\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-run\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-host\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.371171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6d46ce2-d6de-472b-86f9-8e1d10a1c269-iptables-alerter-script\") pod \"iptables-alerter-kzmpd\" (UID: \"b6d46ce2-d6de-472b-86f9-8e1d10a1c269\") " pod="openshift-network-operator/iptables-alerter-kzmpd" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20502fc4-4670-4b18-827e-d706c996ef2f-tmp\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-kubelet\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-run-netns\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-run-ovn\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1876416b-79dd-4ff1-88a4-b7111c5e304d-cni-binary-copy\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1876416b-79dd-4ff1-88a4-b7111c5e304d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370790 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-cni-bin\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-ovnkube-script-lib\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-env-overrides\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.371987 
ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-run-openvswitch\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370934 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-run\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.370974 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-host\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371013 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1876416b-79dd-4ff1-88a4-b7111c5e304d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371263 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-socket-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 
20:50:18.371987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-log-socket\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-system-cni-dir\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-cni-bin\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-sysctl-conf\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-sysctl-d\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 
20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6d46ce2-d6de-472b-86f9-8e1d10a1c269-host-slash\") pod \"iptables-alerter-kzmpd\" (UID: \"b6d46ce2-d6de-472b-86f9-8e1d10a1c269\") " pod="openshift-network-operator/iptables-alerter-kzmpd" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6d46ce2-d6de-472b-86f9-8e1d10a1c269-iptables-alerter-script\") pod \"iptables-alerter-kzmpd\" (UID: \"b6d46ce2-d6de-472b-86f9-8e1d10a1c269\") " pod="openshift-network-operator/iptables-alerter-kzmpd" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-systemd-units\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-slash\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-cni-netd\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.371989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-registration-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-sys\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-os-release\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-sys-fs\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372128 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-sysctl-d\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6d46ce2-d6de-472b-86f9-8e1d10a1c269-host-slash\") pod \"iptables-alerter-kzmpd\" (UID: \"b6d46ce2-d6de-472b-86f9-8e1d10a1c269\") " pod="openshift-network-operator/iptables-alerter-kzmpd" Apr 17 20:50:18.372940 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-systemd-units\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-slash\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-cni-netd\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-registration-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-sys\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-os-release\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-sysconfig\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-node-log\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-ovnkube-script-lib\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-modprobe-d\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372754 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f96c4ba0-6cee-4727-bef7-248a0da4b215-hosts-file\") pod \"node-resolver-qcw6q\" (UID: \"f96c4ba0-6cee-4727-bef7-248a0da4b215\") " pod="openshift-dns/node-resolver-qcw6q" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f96c4ba0-6cee-4727-bef7-248a0da4b215-hosts-file\") pod \"node-resolver-qcw6q\" (UID: \"f96c4ba0-6cee-4727-bef7-248a0da4b215\") " pod="openshift-dns/node-resolver-qcw6q" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-etc-openvswitch\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-etc-openvswitch\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.372979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07c94259-a674-49df-9a32-3cd6d1482c4e-etc-selinux\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.373739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373016 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-sysconfig\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-node-log\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373193 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1876416b-79dd-4ff1-88a4-b7111c5e304d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/20502fc4-4670-4b18-827e-d706c996ef2f-etc-modprobe-d\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-run-ovn\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-kubelet\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-host-run-netns\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/20502fc4-4670-4b18-827e-d706c996ef2f-etc-tuned\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1876416b-79dd-4ff1-88a4-b7111c5e304d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1876416b-79dd-4ff1-88a4-b7111c5e304d-cni-binary-copy\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.373951 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-ovnkube-config\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.374332 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.374067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20502fc4-4670-4b18-827e-d706c996ef2f-tmp\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.375075 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.375051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-ovn-node-metrics-cert\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.398917 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.398801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8c7s\" (UniqueName: \"kubernetes.io/projected/82f6c12a-75ed-42b7-8c6c-bc314957ec1f-kube-api-access-g8c7s\") pod \"ovnkube-node-5vzmr\" (UID: \"82f6c12a-75ed-42b7-8c6c-bc314957ec1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.399266 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.399243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bhcg\" (UniqueName: \"kubernetes.io/projected/20502fc4-4670-4b18-827e-d706c996ef2f-kube-api-access-2bhcg\") pod \"tuned-jhhqq\" (UID: \"20502fc4-4670-4b18-827e-d706c996ef2f\") " pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.399788 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.399764 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rqzk7\" (UniqueName: \"kubernetes.io/projected/07c94259-a674-49df-9a32-3cd6d1482c4e-kube-api-access-rqzk7\") pod \"aws-ebs-csi-driver-node-rlsvn\" (UID: \"07c94259-a674-49df-9a32-3cd6d1482c4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.400404 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.400385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cpn5\" (UniqueName: \"kubernetes.io/projected/f96c4ba0-6cee-4727-bef7-248a0da4b215-kube-api-access-4cpn5\") pod \"node-resolver-qcw6q\" (UID: \"f96c4ba0-6cee-4727-bef7-248a0da4b215\") " pod="openshift-dns/node-resolver-qcw6q" Apr 17 20:50:18.400649 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.400631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9kxs\" (UniqueName: \"kubernetes.io/projected/1876416b-79dd-4ff1-88a4-b7111c5e304d-kube-api-access-p9kxs\") pod \"multus-additional-cni-plugins-rv6lj\" (UID: \"1876416b-79dd-4ff1-88a4-b7111c5e304d\") " pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.400857 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.400836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmj66\" (UniqueName: \"kubernetes.io/projected/b6d46ce2-d6de-472b-86f9-8e1d10a1c269-kube-api-access-tmj66\") pod \"iptables-alerter-kzmpd\" (UID: \"b6d46ce2-d6de-472b-86f9-8e1d10a1c269\") " pod="openshift-network-operator/iptables-alerter-kzmpd" Apr 17 20:50:18.459902 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.459864 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gd4dw" Apr 17 20:50:18.468758 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.468727 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cj26q" Apr 17 20:50:18.476456 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.476433 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-f2pm8" Apr 17 20:50:18.484055 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.484034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kzmpd" Apr 17 20:50:18.489701 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.489666 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:18.495349 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.495330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" Apr 17 20:50:18.501934 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.501914 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" Apr 17 20:50:18.510487 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.510468 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" Apr 17 20:50:18.517073 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.517055 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qcw6q" Apr 17 20:50:18.775743 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.775656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dc82\" (UniqueName: \"kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82\") pod \"network-check-target-rjt2l\" (UID: \"9e571f60-0b76-435b-aac4-aada6990b2b3\") " pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:18.775743 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:18.775715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:18.775969 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.775836 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:50:18.775969 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.775843 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:18.775969 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.775858 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:50:18.775969 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.775870 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9dc82 for pod openshift-network-diagnostics/network-check-target-rjt2l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:18.775969 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.775915 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82 podName:9e571f60-0b76-435b-aac4-aada6990b2b3 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:19.775895948 +0000 UTC m=+4.010362335 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9dc82" (UniqueName: "kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82") pod "network-check-target-rjt2l" (UID: "9e571f60-0b76-435b-aac4-aada6990b2b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:18.775969 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:18.775935 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs podName:b1da9568-78d7-4d7f-93b4-33b608a48c41 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:19.775925388 +0000 UTC m=+4.010391774 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs") pod "network-metrics-daemon-cxq8r" (UID: "b1da9568-78d7-4d7f-93b4-33b608a48c41") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:18.879632 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:18.879603 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1876416b_79dd_4ff1_88a4_b7111c5e304d.slice/crio-1de985b12ff5f84c93d21cf661f3d3f64a05fecbf9dc874af6ef230d2b0c7372 WatchSource:0}: Error finding container 1de985b12ff5f84c93d21cf661f3d3f64a05fecbf9dc874af6ef230d2b0c7372: Status 404 returned error can't find the container with id 1de985b12ff5f84c93d21cf661f3d3f64a05fecbf9dc874af6ef230d2b0c7372 Apr 17 20:50:18.880568 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:18.880543 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10783c15_a601_4d72_90a5_870dc70d9889.slice/crio-1115d79d734bd2d6076093177a6a17db89b1864476fb8c66d9c640ea6629a71c WatchSource:0}: Error finding container 1115d79d734bd2d6076093177a6a17db89b1864476fb8c66d9c640ea6629a71c: Status 404 returned error can't find the container with id 1115d79d734bd2d6076093177a6a17db89b1864476fb8c66d9c640ea6629a71c Apr 17 20:50:18.882149 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:18.882124 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20502fc4_4670_4b18_827e_d706c996ef2f.slice/crio-f7d12e71b841a275f4fe9c9103a2f2fe6019aa754450d9ba0f25a0ddb1db8220 WatchSource:0}: Error finding container f7d12e71b841a275f4fe9c9103a2f2fe6019aa754450d9ba0f25a0ddb1db8220: Status 404 returned error can't find the container with id f7d12e71b841a275f4fe9c9103a2f2fe6019aa754450d9ba0f25a0ddb1db8220 Apr 17 20:50:18.882900 
ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:18.882874 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc1a4043_1274_42fb_ade0_e46458e332ce.slice/crio-ecc55e48205233bcad52e3113b67cdb79cc4f6cce9c66459191f03abd98f4111 WatchSource:0}: Error finding container ecc55e48205233bcad52e3113b67cdb79cc4f6cce9c66459191f03abd98f4111: Status 404 returned error can't find the container with id ecc55e48205233bcad52e3113b67cdb79cc4f6cce9c66459191f03abd98f4111 Apr 17 20:50:18.883685 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:18.883615 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf96c4ba0_6cee_4727_bef7_248a0da4b215.slice/crio-e20edf4fac7f4c1e6f8533191c052166b22ce1feaa83382755b1decf0a1473e3 WatchSource:0}: Error finding container e20edf4fac7f4c1e6f8533191c052166b22ce1feaa83382755b1decf0a1473e3: Status 404 returned error can't find the container with id e20edf4fac7f4c1e6f8533191c052166b22ce1feaa83382755b1decf0a1473e3 Apr 17 20:50:18.884709 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:18.884420 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82f6c12a_75ed_42b7_8c6c_bc314957ec1f.slice/crio-8fc597fb63d86f99b2e9dd86c8cdd4515eb9dabbd176dac5285d607f8303b82d WatchSource:0}: Error finding container 8fc597fb63d86f99b2e9dd86c8cdd4515eb9dabbd176dac5285d607f8303b82d: Status 404 returned error can't find the container with id 8fc597fb63d86f99b2e9dd86c8cdd4515eb9dabbd176dac5285d607f8303b82d Apr 17 20:50:18.886438 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:18.886412 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6d46ce2_d6de_472b_86f9_8e1d10a1c269.slice/crio-5fcbc07b7bb03a0e6f6417c91ea90cb0bc513b370daf4b73d9f6b1c03627aedb WatchSource:0}: Error 
finding container 5fcbc07b7bb03a0e6f6417c91ea90cb0bc513b370daf4b73d9f6b1c03627aedb: Status 404 returned error can't find the container with id 5fcbc07b7bb03a0e6f6417c91ea90cb0bc513b370daf4b73d9f6b1c03627aedb Apr 17 20:50:18.889839 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:18.889187 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf8a6449_19c6_469b_9f9c_049cc3f220b8.slice/crio-ac701ab7b58e74021b96dc37cc2c08cf9a765ddd7ec673e3e28231af3507ea22 WatchSource:0}: Error finding container ac701ab7b58e74021b96dc37cc2c08cf9a765ddd7ec673e3e28231af3507ea22: Status 404 returned error can't find the container with id ac701ab7b58e74021b96dc37cc2c08cf9a765ddd7ec673e3e28231af3507ea22 Apr 17 20:50:18.890526 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:18.890498 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07c94259_a674_49df_9a32_3cd6d1482c4e.slice/crio-6b289b725e9e81a2a06e00b93dfa320ad1b87e909648a2279708d657f9da592d WatchSource:0}: Error finding container 6b289b725e9e81a2a06e00b93dfa320ad1b87e909648a2279708d657f9da592d: Status 404 returned error can't find the container with id 6b289b725e9e81a2a06e00b93dfa320ad1b87e909648a2279708d657f9da592d Apr 17 20:50:19.197572 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.197471 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:45:17 +0000 UTC" deadline="2027-11-02 21:40:55.407752307 +0000 UTC" Apr 17 20:50:19.197572 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.197514 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13536h50m36.210242344s" Apr 17 20:50:19.243814 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.243771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-rv6lj" event={"ID":"1876416b-79dd-4ff1-88a4-b7111c5e304d","Type":"ContainerStarted","Data":"1de985b12ff5f84c93d21cf661f3d3f64a05fecbf9dc874af6ef230d2b0c7372"} Apr 17 20:50:19.245501 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.245468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" event={"ID":"07c94259-a674-49df-9a32-3cd6d1482c4e","Type":"ContainerStarted","Data":"6b289b725e9e81a2a06e00b93dfa320ad1b87e909648a2279708d657f9da592d"} Apr 17 20:50:19.246557 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.246527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f2pm8" event={"ID":"bf8a6449-19c6-469b-9f9c-049cc3f220b8","Type":"ContainerStarted","Data":"ac701ab7b58e74021b96dc37cc2c08cf9a765ddd7ec673e3e28231af3507ea22"} Apr 17 20:50:19.247908 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.247878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qcw6q" event={"ID":"f96c4ba0-6cee-4727-bef7-248a0da4b215","Type":"ContainerStarted","Data":"e20edf4fac7f4c1e6f8533191c052166b22ce1feaa83382755b1decf0a1473e3"} Apr 17 20:50:19.249459 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.249439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" event={"ID":"20502fc4-4670-4b18-827e-d706c996ef2f","Type":"ContainerStarted","Data":"f7d12e71b841a275f4fe9c9103a2f2fe6019aa754450d9ba0f25a0ddb1db8220"} Apr 17 20:50:19.251274 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.251255 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal" event={"ID":"3b8513526e5cd2b3c1dca895a37fc635","Type":"ContainerStarted","Data":"8893f94f54c2eb6378a5668e9fc261f83ab11ab491a10ee6d1257b2de3cc3b18"} Apr 17 20:50:19.252543 ip-10-0-130-66 kubenswrapper[2576]: I0417 
20:50:19.252521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kzmpd" event={"ID":"b6d46ce2-d6de-472b-86f9-8e1d10a1c269","Type":"ContainerStarted","Data":"5fcbc07b7bb03a0e6f6417c91ea90cb0bc513b370daf4b73d9f6b1c03627aedb"} Apr 17 20:50:19.253547 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.253524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" event={"ID":"82f6c12a-75ed-42b7-8c6c-bc314957ec1f","Type":"ContainerStarted","Data":"8fc597fb63d86f99b2e9dd86c8cdd4515eb9dabbd176dac5285d607f8303b82d"} Apr 17 20:50:19.255044 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.255017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gd4dw" event={"ID":"10783c15-a601-4d72-90a5-870dc70d9889","Type":"ContainerStarted","Data":"1115d79d734bd2d6076093177a6a17db89b1864476fb8c66d9c640ea6629a71c"} Apr 17 20:50:19.259685 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.259634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cj26q" event={"ID":"fc1a4043-1274-42fb-ade0-e46458e332ce","Type":"ContainerStarted","Data":"ecc55e48205233bcad52e3113b67cdb79cc4f6cce9c66459191f03abd98f4111"} Apr 17 20:50:19.264556 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.264493 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-66.ec2.internal" podStartSLOduration=2.264479027 podStartE2EDuration="2.264479027s" podCreationTimestamp="2026-04-17 20:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:50:19.264418996 +0000 UTC m=+3.498885406" watchObservedRunningTime="2026-04-17 20:50:19.264479027 +0000 UTC m=+3.498945436" Apr 17 20:50:19.785573 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.785530 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:19.785764 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:19.785628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dc82\" (UniqueName: \"kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82\") pod \"network-check-target-rjt2l\" (UID: \"9e571f60-0b76-435b-aac4-aada6990b2b3\") " pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:19.785851 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:19.785769 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:50:19.785851 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:19.785787 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:50:19.785851 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:19.785799 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9dc82 for pod openshift-network-diagnostics/network-check-target-rjt2l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:19.786008 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:19.785860 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82 podName:9e571f60-0b76-435b-aac4-aada6990b2b3 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:50:21.785841542 +0000 UTC m=+6.020307950 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9dc82" (UniqueName: "kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82") pod "network-check-target-rjt2l" (UID: "9e571f60-0b76-435b-aac4-aada6990b2b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:19.786008 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:19.785933 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:19.786008 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:19.785969 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs podName:b1da9568-78d7-4d7f-93b4-33b608a48c41 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:21.785956897 +0000 UTC m=+6.020423295 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs") pod "network-metrics-daemon-cxq8r" (UID: "b1da9568-78d7-4d7f-93b4-33b608a48c41") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:20.238274 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:20.238192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:20.238719 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:20.238314 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:20.238783 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:20.238761 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:20.238892 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:20.238868 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:20.273154 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:20.273120 2576 generic.go:358] "Generic (PLEG): container finished" podID="c59ba66ee4d7ecaead53b509d5e7b5e5" containerID="370d176ec2329cbb54b60315ce15910969190dc0590e07e31f7bde4082caf6c1" exitCode=0 Apr 17 20:50:20.274113 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:20.274085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" event={"ID":"c59ba66ee4d7ecaead53b509d5e7b5e5","Type":"ContainerDied","Data":"370d176ec2329cbb54b60315ce15910969190dc0590e07e31f7bde4082caf6c1"} Apr 17 20:50:21.280071 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:21.280032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" event={"ID":"c59ba66ee4d7ecaead53b509d5e7b5e5","Type":"ContainerStarted","Data":"eb81da5e425e6ad7c58f88cc8e235fe76bac81a043e8edf5f2f6f954839a4f1a"} Apr 17 20:50:21.801885 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:21.801843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9dc82\" (UniqueName: \"kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82\") pod \"network-check-target-rjt2l\" (UID: \"9e571f60-0b76-435b-aac4-aada6990b2b3\") " pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:21.802092 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:21.801915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:21.802092 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:21.802081 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:21.802394 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:21.802151 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs podName:b1da9568-78d7-4d7f-93b4-33b608a48c41 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:25.802130356 +0000 UTC m=+10.036596749 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs") pod "network-metrics-daemon-cxq8r" (UID: "b1da9568-78d7-4d7f-93b4-33b608a48c41") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:21.802508 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:21.802489 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:50:21.802586 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:21.802512 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:50:21.802586 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:21.802522 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9dc82 for pod openshift-network-diagnostics/network-check-target-rjt2l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:21.802586 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:21.802563 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82 podName:9e571f60-0b76-435b-aac4-aada6990b2b3 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:25.802551505 +0000 UTC m=+10.037017889 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9dc82" (UniqueName: "kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82") pod "network-check-target-rjt2l" (UID: "9e571f60-0b76-435b-aac4-aada6990b2b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:22.235412 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:22.235322 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:22.235571 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:22.235431 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:22.235891 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:22.235867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:22.236004 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:22.235983 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:24.236195 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:24.236160 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:24.236656 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:24.236302 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:24.236656 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:24.236160 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:24.236745 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:24.236719 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:25.837327 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:25.837272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dc82\" (UniqueName: \"kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82\") pod \"network-check-target-rjt2l\" (UID: \"9e571f60-0b76-435b-aac4-aada6990b2b3\") " pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:25.837777 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:25.837342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:25.837777 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:25.837484 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:25.837777 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:25.837483 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:50:25.837777 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:25.837517 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:50:25.837777 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:25.837530 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9dc82 for pod openshift-network-diagnostics/network-check-target-rjt2l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:25.837777 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:25.837546 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs podName:b1da9568-78d7-4d7f-93b4-33b608a48c41 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:33.837526363 +0000 UTC m=+18.071992756 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs") pod "network-metrics-daemon-cxq8r" (UID: "b1da9568-78d7-4d7f-93b4-33b608a48c41") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:25.837777 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:25.837582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82 podName:9e571f60-0b76-435b-aac4-aada6990b2b3 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:33.837565993 +0000 UTC m=+18.072032392 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9dc82" (UniqueName: "kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82") pod "network-check-target-rjt2l" (UID: "9e571f60-0b76-435b-aac4-aada6990b2b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:26.236974 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:26.236904 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:26.237125 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:26.237004 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:26.237445 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:26.237424 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:26.237572 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:26.237528 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:28.236293 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:28.235373 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:28.236293 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:28.235744 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:28.236293 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:28.235410 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:28.236293 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:28.236210 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:30.235774 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:30.235737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:30.236241 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:30.235879 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:30.236241 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:30.235945 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:30.236241 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:30.236072 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:32.235701 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:32.235669 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:32.236168 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:32.235808 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:32.236168 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:32.235870 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:32.236168 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:32.235985 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:33.897790 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:33.897743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:33.898323 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:33.897827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dc82\" (UniqueName: \"kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82\") pod \"network-check-target-rjt2l\" (UID: \"9e571f60-0b76-435b-aac4-aada6990b2b3\") " pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:33.898323 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:33.897893 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:33.898323 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:33.897953 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:50:33.898323 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:33.897970 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:50:33.898323 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:33.897982 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9dc82 for pod openshift-network-diagnostics/network-check-target-rjt2l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:33.898323 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:33.897996 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs podName:b1da9568-78d7-4d7f-93b4-33b608a48c41 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:49.897972479 +0000 UTC m=+34.132438867 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs") pod "network-metrics-daemon-cxq8r" (UID: "b1da9568-78d7-4d7f-93b4-33b608a48c41") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:33.898323 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:33.898020 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82 podName:9e571f60-0b76-435b-aac4-aada6990b2b3 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:49.898009394 +0000 UTC m=+34.132475795 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9dc82" (UniqueName: "kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82") pod "network-check-target-rjt2l" (UID: "9e571f60-0b76-435b-aac4-aada6990b2b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:34.235528 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:34.235446 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:34.235691 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:34.235451 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:34.235691 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:34.235573 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:34.235691 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:34.235668 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:36.241881 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.237998 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:36.241881 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:36.238137 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:36.241881 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.238482 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:36.241881 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:36.238568 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:36.314402 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.314126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" event={"ID":"1876416b-79dd-4ff1-88a4-b7111c5e304d","Type":"ContainerStarted","Data":"3beb33d6b8ccdba8725311e66ee2e210a14e519f325ffa8d1025d4fe4b5d5a7c"} Apr 17 20:50:36.315746 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.315721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" event={"ID":"07c94259-a674-49df-9a32-3cd6d1482c4e","Type":"ContainerStarted","Data":"44332b1ab7c87ddc7355c85c170afc8588d9fd043da6935d163d9dab9e18b9d2"} Apr 17 20:50:36.317050 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.317025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f2pm8" event={"ID":"bf8a6449-19c6-469b-9f9c-049cc3f220b8","Type":"ContainerStarted","Data":"5904a18bb775b13ae5f76716fd3cc9c62a69105fc006413f472ada08609f725a"} Apr 17 20:50:36.320069 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.320048 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" event={"ID":"20502fc4-4670-4b18-827e-d706c996ef2f","Type":"ContainerStarted","Data":"801a5c99af680d016b52eb2aefd93d0733362eff6e46f8cefec44a6791f8cd1b"} Apr 17 20:50:36.322906 ip-10-0-130-66 
kubenswrapper[2576]: I0417 20:50:36.322800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gd4dw" event={"ID":"10783c15-a601-4d72-90a5-870dc70d9889","Type":"ContainerStarted","Data":"865ea88efb09cd9d2d167eb5cba1a1b9264c2c4b39e4e3918850e2282055c873"} Apr 17 20:50:36.324427 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.324346 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cj26q" event={"ID":"fc1a4043-1274-42fb-ade0-e46458e332ce","Type":"ContainerStarted","Data":"1a478882c05505d2242d2b84b66987325812d53723c39ab0b1b04c0fde80b40c"} Apr 17 20:50:36.334456 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.334418 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-66.ec2.internal" podStartSLOduration=19.334407306 podStartE2EDuration="19.334407306s" podCreationTimestamp="2026-04-17 20:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:50:21.293555756 +0000 UTC m=+5.528022164" watchObservedRunningTime="2026-04-17 20:50:36.334407306 +0000 UTC m=+20.568873714" Apr 17 20:50:36.345283 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.345245 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cj26q" podStartSLOduration=3.22426733 podStartE2EDuration="20.345231777s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" firstStartedPulling="2026-04-17 20:50:18.88560658 +0000 UTC m=+3.120072970" lastFinishedPulling="2026-04-17 20:50:36.00657102 +0000 UTC m=+20.241037417" observedRunningTime="2026-04-17 20:50:36.344811145 +0000 UTC m=+20.579277552" watchObservedRunningTime="2026-04-17 20:50:36.345231777 +0000 UTC m=+20.579698183" Apr 17 20:50:36.355790 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.355754 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/konnectivity-agent-gd4dw" podStartSLOduration=3.274133671 podStartE2EDuration="20.355743085s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" firstStartedPulling="2026-04-17 20:50:18.882233642 +0000 UTC m=+3.116700027" lastFinishedPulling="2026-04-17 20:50:35.963843049 +0000 UTC m=+20.198309441" observedRunningTime="2026-04-17 20:50:36.355394915 +0000 UTC m=+20.589861326" watchObservedRunningTime="2026-04-17 20:50:36.355743085 +0000 UTC m=+20.590209492" Apr 17 20:50:36.367081 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.367035 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-f2pm8" podStartSLOduration=7.75087763 podStartE2EDuration="20.367021707s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" firstStartedPulling="2026-04-17 20:50:18.89226331 +0000 UTC m=+3.126729700" lastFinishedPulling="2026-04-17 20:50:31.508407376 +0000 UTC m=+15.742873777" observedRunningTime="2026-04-17 20:50:36.366830938 +0000 UTC m=+20.601297345" watchObservedRunningTime="2026-04-17 20:50:36.367021707 +0000 UTC m=+20.601488114" Apr 17 20:50:36.380973 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:36.379825 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jhhqq" podStartSLOduration=3.297599783 podStartE2EDuration="20.379809435s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" firstStartedPulling="2026-04-17 20:50:18.883906049 +0000 UTC m=+3.118372436" lastFinishedPulling="2026-04-17 20:50:35.96611569 +0000 UTC m=+20.200582088" observedRunningTime="2026-04-17 20:50:36.378557743 +0000 UTC m=+20.613024150" watchObservedRunningTime="2026-04-17 20:50:36.379809435 +0000 UTC m=+20.614275842" Apr 17 20:50:37.330164 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.330145 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log" Apr 17 20:50:37.330506 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.330468 2576 generic.go:358] "Generic (PLEG): container finished" podID="82f6c12a-75ed-42b7-8c6c-bc314957ec1f" containerID="1bebc0e135bdea85c6ed55514c29af2dbf10a272540e8e8760de09c564b288be" exitCode=1 Apr 17 20:50:37.330571 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.330542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" event={"ID":"82f6c12a-75ed-42b7-8c6c-bc314957ec1f","Type":"ContainerStarted","Data":"f7a7d2cf77aa56984879b1d7b229321e1c580a66e9bc2a244079cd5edfe06e30"} Apr 17 20:50:37.330646 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.330586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" event={"ID":"82f6c12a-75ed-42b7-8c6c-bc314957ec1f","Type":"ContainerStarted","Data":"a218be4cae2cd2640c1b7508b206da5aa3cb6ec3152a8dc6ba18bc2fa548ddf7"} Apr 17 20:50:37.330646 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.330611 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" event={"ID":"82f6c12a-75ed-42b7-8c6c-bc314957ec1f","Type":"ContainerStarted","Data":"3a7d83a3807ac762612a58cacd8419c2e606aebf8a98a46b30665bc5a2638a29"} Apr 17 20:50:37.330733 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.330623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" event={"ID":"82f6c12a-75ed-42b7-8c6c-bc314957ec1f","Type":"ContainerStarted","Data":"d184679c84819d94b7c9f6803ac65618f38d400d0d49bfb26a020bfcff641519"} Apr 17 20:50:37.330733 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.330669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" 
event={"ID":"82f6c12a-75ed-42b7-8c6c-bc314957ec1f","Type":"ContainerStarted","Data":"90798afe70fa28da64f227e1bb61da21269e29044beeb185e41ab78898b5aef7"} Apr 17 20:50:37.330733 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.330683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" event={"ID":"82f6c12a-75ed-42b7-8c6c-bc314957ec1f","Type":"ContainerDied","Data":"1bebc0e135bdea85c6ed55514c29af2dbf10a272540e8e8760de09c564b288be"} Apr 17 20:50:37.331723 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.331701 2576 generic.go:358] "Generic (PLEG): container finished" podID="1876416b-79dd-4ff1-88a4-b7111c5e304d" containerID="3beb33d6b8ccdba8725311e66ee2e210a14e519f325ffa8d1025d4fe4b5d5a7c" exitCode=0 Apr 17 20:50:37.331816 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.331767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" event={"ID":"1876416b-79dd-4ff1-88a4-b7111c5e304d","Type":"ContainerDied","Data":"3beb33d6b8ccdba8725311e66ee2e210a14e519f325ffa8d1025d4fe4b5d5a7c"} Apr 17 20:50:37.333009 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.332985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qcw6q" event={"ID":"f96c4ba0-6cee-4727-bef7-248a0da4b215","Type":"ContainerStarted","Data":"791e2ea17c8e0fce79c08795ba273f4100f907f0c9e88df44f486095e4260b2a"} Apr 17 20:50:37.358744 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.358697 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qcw6q" podStartSLOduration=4.282664247 podStartE2EDuration="21.358686235s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" firstStartedPulling="2026-04-17 20:50:18.887833968 +0000 UTC m=+3.122300353" lastFinishedPulling="2026-04-17 20:50:35.96385594 +0000 UTC m=+20.198322341" observedRunningTime="2026-04-17 20:50:37.358638125 +0000 UTC m=+21.593104528" 
watchObservedRunningTime="2026-04-17 20:50:37.358686235 +0000 UTC m=+21.593152641" Apr 17 20:50:37.682204 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:37.682173 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 20:50:38.076997 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.076906 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gd4dw" Apr 17 20:50:38.235388 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.235004 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:50:37.682201844Z","UUID":"ea249c70-c266-46ee-ad3c-557e798186ee","Handler":null,"Name":"","Endpoint":""} Apr 17 20:50:38.235388 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.235211 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:38.235388 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:38.235309 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:38.235861 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.235839 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:38.235941 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:38.235926 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:38.237785 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.237763 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 20:50:38.237899 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.237793 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 20:50:38.337086 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.336989 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kzmpd" event={"ID":"b6d46ce2-d6de-472b-86f9-8e1d10a1c269","Type":"ContainerStarted","Data":"ad22359aed41179901991e39b8452ba466db951cfca383a4f72ae0f3c980a93e"} Apr 17 20:50:38.338927 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.338885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" event={"ID":"07c94259-a674-49df-9a32-3cd6d1482c4e","Type":"ContainerStarted","Data":"478a600f671fafdb0c7117a0b74d9fde2f2a95826a4acb59afc74d5c7896d84d"} Apr 17 20:50:38.358818 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.358772 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kzmpd" 
podStartSLOduration=5.284576178 podStartE2EDuration="22.358758953s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" firstStartedPulling="2026-04-17 20:50:18.889684883 +0000 UTC m=+3.124151267" lastFinishedPulling="2026-04-17 20:50:35.963867652 +0000 UTC m=+20.198334042" observedRunningTime="2026-04-17 20:50:38.358629908 +0000 UTC m=+22.593096315" watchObservedRunningTime="2026-04-17 20:50:38.358758953 +0000 UTC m=+22.593225360" Apr 17 20:50:38.709248 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.709159 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-c6jkl"] Apr 17 20:50:38.729643 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.728997 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:38.729643 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:38.729086 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-c6jkl" podUID="7e49bff1-efa9-421a-8ee8-f431f0f0c109" Apr 17 20:50:38.833048 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.833009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7e49bff1-efa9-421a-8ee8-f431f0f0c109-kubelet-config\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:38.833048 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.833050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:38.833252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.833083 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7e49bff1-efa9-421a-8ee8-f431f0f0c109-dbus\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:38.934281 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.934247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7e49bff1-efa9-421a-8ee8-f431f0f0c109-kubelet-config\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:38.934281 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.934285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:38.934508 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.934321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7e49bff1-efa9-421a-8ee8-f431f0f0c109-dbus\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:38.934508 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.934409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7e49bff1-efa9-421a-8ee8-f431f0f0c109-kubelet-config\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:38.934508 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:38.934445 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:50:38.934508 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:38.934510 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret podName:7e49bff1-efa9-421a-8ee8-f431f0f0c109 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:39.434489749 +0000 UTC m=+23.668956136 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret") pod "global-pull-secret-syncer-c6jkl" (UID: "7e49bff1-efa9-421a-8ee8-f431f0f0c109") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:50:38.934686 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:38.934639 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7e49bff1-efa9-421a-8ee8-f431f0f0c109-dbus\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:39.342281 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:39.342186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" event={"ID":"07c94259-a674-49df-9a32-3cd6d1482c4e","Type":"ContainerStarted","Data":"13cae0ac8535b4d82f49dc7be945c881a1ff0bcf4d364b83c734c7f1f5df1bfc"} Apr 17 20:50:39.346007 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:39.345977 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log" Apr 17 20:50:39.346328 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:39.346308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" event={"ID":"82f6c12a-75ed-42b7-8c6c-bc314957ec1f","Type":"ContainerStarted","Data":"d96128a52ed2ef03206ac291a1dee6070d27643b0aa2f2ad4d45adcb9e9b3fb8"} Apr 17 20:50:39.359659 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:39.359609 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rlsvn" podStartSLOduration=3.254146665 podStartE2EDuration="23.359591012s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" 
firstStartedPulling="2026-04-17 20:50:18.893017963 +0000 UTC m=+3.127484349" lastFinishedPulling="2026-04-17 20:50:38.99846231 +0000 UTC m=+23.232928696" observedRunningTime="2026-04-17 20:50:39.359097017 +0000 UTC m=+23.593563425" watchObservedRunningTime="2026-04-17 20:50:39.359591012 +0000 UTC m=+23.594057420" Apr 17 20:50:39.438571 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:39.438534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:39.438772 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:39.438645 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:50:39.438772 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:39.438711 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret podName:7e49bff1-efa9-421a-8ee8-f431f0f0c109 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:40.438694789 +0000 UTC m=+24.673161174 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret") pod "global-pull-secret-syncer-c6jkl" (UID: "7e49bff1-efa9-421a-8ee8-f431f0f0c109") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:50:40.240919 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:40.240886 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:40.241110 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:40.240886 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:40.241110 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:40.241005 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:40.241110 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:40.241086 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:40.241110 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:40.240890 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:40.241290 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:40.241227 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-c6jkl" podUID="7e49bff1-efa9-421a-8ee8-f431f0f0c109" Apr 17 20:50:40.417832 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:40.417794 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gd4dw" Apr 17 20:50:40.418561 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:40.418540 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gd4dw" Apr 17 20:50:40.445527 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:40.445255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:40.445527 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:40.445398 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:50:40.445527 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:40.445462 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret podName:7e49bff1-efa9-421a-8ee8-f431f0f0c109 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:42.44544542 +0000 UTC m=+26.679911810 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret") pod "global-pull-secret-syncer-c6jkl" (UID: "7e49bff1-efa9-421a-8ee8-f431f0f0c109") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:50:41.350641 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:41.350612 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gd4dw" Apr 17 20:50:42.235300 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.235270 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:50:42.235691 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.235270 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:42.235691 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:42.235446 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:50:42.235691 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:42.235500 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-c6jkl" podUID="7e49bff1-efa9-421a-8ee8-f431f0f0c109" Apr 17 20:50:42.235691 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.235523 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:42.235691 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:42.235629 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3" Apr 17 20:50:42.354935 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.354907 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log" Apr 17 20:50:42.355386 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.355337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" event={"ID":"82f6c12a-75ed-42b7-8c6c-bc314957ec1f","Type":"ContainerStarted","Data":"24798d6ca0371e09d7be6a0a3478d12d3ca5e8de7b9e9853284456ccd853b5a9"} Apr 17 20:50:42.355822 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.355775 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:42.355822 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.355806 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:42.355948 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.355932 2576 scope.go:117] "RemoveContainer" 
containerID="1bebc0e135bdea85c6ed55514c29af2dbf10a272540e8e8760de09c564b288be" Apr 17 20:50:42.358005 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.357982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" event={"ID":"1876416b-79dd-4ff1-88a4-b7111c5e304d","Type":"ContainerStarted","Data":"32dc143b01906141a8b16aaee7daadc80ae322156ea5f5d7d7f4558716c71cc8"} Apr 17 20:50:42.370867 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.370848 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:42.461893 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:42.461862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:42.462421 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:42.462398 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:50:42.462521 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:42.462476 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret podName:7e49bff1-efa9-421a-8ee8-f431f0f0c109 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:46.46245703 +0000 UTC m=+30.696923430 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret") pod "global-pull-secret-syncer-c6jkl" (UID: "7e49bff1-efa9-421a-8ee8-f431f0f0c109") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:50:43.361231 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.361187 2576 generic.go:358] "Generic (PLEG): container finished" podID="1876416b-79dd-4ff1-88a4-b7111c5e304d" containerID="32dc143b01906141a8b16aaee7daadc80ae322156ea5f5d7d7f4558716c71cc8" exitCode=0 Apr 17 20:50:43.361674 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.361256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" event={"ID":"1876416b-79dd-4ff1-88a4-b7111c5e304d","Type":"ContainerDied","Data":"32dc143b01906141a8b16aaee7daadc80ae322156ea5f5d7d7f4558716c71cc8"} Apr 17 20:50:43.364221 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.364204 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log" Apr 17 20:50:43.364562 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.364539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" event={"ID":"82f6c12a-75ed-42b7-8c6c-bc314957ec1f","Type":"ContainerStarted","Data":"d1aceb4e71b04e7336500a4917138a043d0ddeee4a5de5e19a4d1922aa05f49d"} Apr 17 20:50:43.364788 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.364769 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:43.380594 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.380569 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:50:43.391631 ip-10-0-130-66 kubenswrapper[2576]: I0417 
20:50:43.391579 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" podStartSLOduration=10.094241755 podStartE2EDuration="27.391564191s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" firstStartedPulling="2026-04-17 20:50:18.887543048 +0000 UTC m=+3.122009447" lastFinishedPulling="2026-04-17 20:50:36.184865495 +0000 UTC m=+20.419331883" observedRunningTime="2026-04-17 20:50:43.390683245 +0000 UTC m=+27.625149652" watchObservedRunningTime="2026-04-17 20:50:43.391564191 +0000 UTC m=+27.626030604"
Apr 17 20:50:43.786738 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.786475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c6jkl"]
Apr 17 20:50:43.786870 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.786797 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jkl"
Apr 17 20:50:43.786920 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:43.786878 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c6jkl" podUID="7e49bff1-efa9-421a-8ee8-f431f0f0c109"
Apr 17 20:50:43.789204 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.789176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cxq8r"]
Apr 17 20:50:43.789317 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.789280 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r"
Apr 17 20:50:43.789400 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:43.789382 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41"
Apr 17 20:50:43.797387 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.797342 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rjt2l"]
Apr 17 20:50:43.797518 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:43.797464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l"
Apr 17 20:50:43.797591 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:43.797553 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3"
Apr 17 20:50:44.368181 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:44.368147 2576 generic.go:358] "Generic (PLEG): container finished" podID="1876416b-79dd-4ff1-88a4-b7111c5e304d" containerID="ec3d3e6f8eb02f4809de46e4fceb3340408e84afcdca754bf3ba8e86a4d914a2" exitCode=0
Apr 17 20:50:44.368654 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:44.368246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" event={"ID":"1876416b-79dd-4ff1-88a4-b7111c5e304d","Type":"ContainerDied","Data":"ec3d3e6f8eb02f4809de46e4fceb3340408e84afcdca754bf3ba8e86a4d914a2"}
Apr 17 20:50:45.235747 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:45.235719 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l"
Apr 17 20:50:45.235898 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:45.235795 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jkl"
Apr 17 20:50:45.235934 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:45.235895 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3"
Apr 17 20:50:45.236053 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:45.236032 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c6jkl" podUID="7e49bff1-efa9-421a-8ee8-f431f0f0c109"
Apr 17 20:50:46.238900 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:46.238824 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r"
Apr 17 20:50:46.239420 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:46.238950 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41"
Apr 17 20:50:46.377530 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:46.377489 2576 generic.go:358] "Generic (PLEG): container finished" podID="1876416b-79dd-4ff1-88a4-b7111c5e304d" containerID="2cddcd29d861b33756fb52615798587ea073a14d66290e5309afac1fdba9b48c" exitCode=0
Apr 17 20:50:46.377666 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:46.377546 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" event={"ID":"1876416b-79dd-4ff1-88a4-b7111c5e304d","Type":"ContainerDied","Data":"2cddcd29d861b33756fb52615798587ea073a14d66290e5309afac1fdba9b48c"}
Apr 17 20:50:46.492573 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:46.492486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl"
Apr 17 20:50:46.492728 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:46.492635 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:50:46.492728 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:46.492693 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret podName:7e49bff1-efa9-421a-8ee8-f431f0f0c109 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:54.492675229 +0000 UTC m=+38.727141634 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret") pod "global-pull-secret-syncer-c6jkl" (UID: "7e49bff1-efa9-421a-8ee8-f431f0f0c109") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:50:47.235646 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:47.235610 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l"
Apr 17 20:50:47.235646 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:47.235645 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jkl"
Apr 17 20:50:47.235879 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:47.235721 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3"
Apr 17 20:50:47.235937 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:47.235869 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c6jkl" podUID="7e49bff1-efa9-421a-8ee8-f431f0f0c109"
Apr 17 20:50:48.238701 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:48.238675 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r"
Apr 17 20:50:48.239261 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:48.238817 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41"
Apr 17 20:50:49.236105 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:49.236070 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l"
Apr 17 20:50:49.236297 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:49.236070 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jkl"
Apr 17 20:50:49.236297 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:49.236216 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rjt2l" podUID="9e571f60-0b76-435b-aac4-aada6990b2b3"
Apr 17 20:50:49.236297 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:49.236278 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c6jkl" podUID="7e49bff1-efa9-421a-8ee8-f431f0f0c109"
Apr 17 20:50:49.918915 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:49.918814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r"
Apr 17 20:50:49.918915 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:49.918911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dc82\" (UniqueName: \"kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82\") pod \"network-check-target-rjt2l\" (UID: \"9e571f60-0b76-435b-aac4-aada6990b2b3\") " pod="openshift-network-diagnostics/network-check-target-rjt2l"
Apr 17 20:50:49.919485 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:49.918980 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:49.919485 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:49.919019 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:50:49.919485 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:49.919035 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:50:49.919485 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:49.919047 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9dc82 for pod openshift-network-diagnostics/network-check-target-rjt2l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:49.919485 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:49.919053 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs podName:b1da9568-78d7-4d7f-93b4-33b608a48c41 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:21.919036886 +0000 UTC m=+66.153503293 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs") pod "network-metrics-daemon-cxq8r" (UID: "b1da9568-78d7-4d7f-93b4-33b608a48c41") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:49.919485 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:49.919084 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82 podName:9e571f60-0b76-435b-aac4-aada6990b2b3 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:21.919072736 +0000 UTC m=+66.153539122 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9dc82" (UniqueName: "kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82") pod "network-check-target-rjt2l" (UID: "9e571f60-0b76-435b-aac4-aada6990b2b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:50.093022 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.092993 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-66.ec2.internal" event="NodeReady"
Apr 17 20:50:50.093204 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.093147 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 20:50:50.132290 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.132258 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5bfdd85f76-q4czg"]
Apr 17 20:50:50.154782 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.154748 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.155706 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.155676 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5bfdd85f76-q4czg"]
Apr 17 20:50:50.157318 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.157296 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 20:50:50.157584 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.157296 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-j6pkz\""
Apr 17 20:50:50.157584 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.157323 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 20:50:50.157584 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.157383 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 20:50:50.162434 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.162404 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kx6mg"]
Apr 17 20:50:50.172294 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.172239 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 20:50:50.178799 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.178774 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"]
Apr 17 20:50:50.178965 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.178946 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.181913 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.181892 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 20:50:50.182732 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.182713 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 20:50:50.182956 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.182939 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tswpv\""
Apr 17 20:50:50.190960 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.190935 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dpkwf"]
Apr 17 20:50:50.191106 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.191086 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"
Apr 17 20:50:50.193651 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.193630 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 20:50:50.193824 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.193674 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-ct2lq\""
Apr 17 20:50:50.193878 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.193853 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 20:50:50.208181 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.208162 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"]
Apr 17 20:50:50.208181 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.208184 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kx6mg"]
Apr 17 20:50:50.208377 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.208193 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dpkwf"]
Apr 17 20:50:50.208377 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.208281 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dpkwf"
Apr 17 20:50:50.211785 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.211770 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 20:50:50.212004 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.211990 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 20:50:50.212317 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.212298 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 20:50:50.212502 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.212478 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jvfcm\""
Apr 17 20:50:50.235871 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.235777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r"
Apr 17 20:50:50.239336 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.239311 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f22jz\""
Apr 17 20:50:50.239511 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.239325 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.321825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.321902 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34199fd4-6043-479b-9af9-8ddbd520a431-ca-trust-extracted\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.321923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-trusted-ca\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.321942 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s44w8\" (UniqueName: \"kubernetes.io/projected/3bd339db-87cf-44af-8c74-4c5f57f80ccc-kube-api-access-s44w8\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-image-registry-private-configuration\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dea8814d-46ce-4135-a1ce-f5b8ff97088a-tmp-dir\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c0f5afe7-3f63-49d2-8f14-97de5a47e278-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-bound-sa-token\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-registry-certificates\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dea8814d-46ce-4135-a1ce-f5b8ff97088a-config-volume\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89qhs\" (UniqueName: \"kubernetes.io/projected/dea8814d-46ce-4135-a1ce-f5b8ff97088a-kube-api-access-89qhs\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.322379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-installation-pull-secrets\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.323005 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.322408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6fxz\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-kube-api-access-g6fxz\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.423317 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c0f5afe7-3f63-49d2-8f14-97de5a47e278-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"
Apr 17 20:50:50.423317 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-bound-sa-token\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.423565 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"
Apr 17 20:50:50.423565 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-registry-certificates\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.423565 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dea8814d-46ce-4135-a1ce-f5b8ff97088a-config-volume\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.423565 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89qhs\" (UniqueName: \"kubernetes.io/projected/dea8814d-46ce-4135-a1ce-f5b8ff97088a-kube-api-access-89qhs\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.423565 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.423565 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-installation-pull-secrets\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.423565 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6fxz\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-kube-api-access-g6fxz\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.423565 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf"
Apr 17 20:50:50.423565 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34199fd4-6043-479b-9af9-8ddbd520a431-ca-trust-extracted\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.423477 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-trusted-ca\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s44w8\" (UniqueName: \"kubernetes.io/projected/3bd339db-87cf-44af-8c74-4c5f57f80ccc-kube-api-access-s44w8\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf"
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-image-registry-private-configuration\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.423658 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert podName:c0f5afe7-3f63-49d2-8f14-97de5a47e278 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:50.923638126 +0000 UTC m=+35.158104512 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cs29g" (UID: "c0f5afe7-3f63-49d2-8f14-97de5a47e278") : secret "networking-console-plugin-cert" not found
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.423694 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.423695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dea8814d-46ce-4135-a1ce-f5b8ff97088a-tmp-dir\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.423741 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls podName:dea8814d-46ce-4135-a1ce-f5b8ff97088a nodeName:}" failed. No retries permitted until 2026-04-17 20:50:50.923729457 +0000 UTC m=+35.158195843 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls") pod "dns-default-kx6mg" (UID: "dea8814d-46ce-4135-a1ce-f5b8ff97088a") : secret "dns-default-metrics-tls" not found
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.423782 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:50:50.423868 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.423800 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert podName:3bd339db-87cf-44af-8c74-4c5f57f80ccc nodeName:}" failed. No retries permitted until 2026-04-17 20:50:50.923794276 +0000 UTC m=+35.158260661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert") pod "ingress-canary-dpkwf" (UID: "3bd339db-87cf-44af-8c74-4c5f57f80ccc") : secret "canary-serving-cert" not found
Apr 17 20:50:50.424225 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.424006 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dea8814d-46ce-4135-a1ce-f5b8ff97088a-tmp-dir\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.424225 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.424023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dea8814d-46ce-4135-a1ce-f5b8ff97088a-config-volume\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:50.424225 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.424057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c0f5afe7-3f63-49d2-8f14-97de5a47e278-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"
Apr 17 20:50:50.424225 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.424076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34199fd4-6043-479b-9af9-8ddbd520a431-ca-trust-extracted\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:50.424440 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.424379 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:50:50.424440 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.424401 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bfdd85f76-q4czg: secret "image-registry-tls" not found
Apr 17 20:50:50.424542 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.424453 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls podName:34199fd4-6043-479b-9af9-8ddbd520a431 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:50.924438123 +0000 UTC m=+35.158904521 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls") pod "image-registry-5bfdd85f76-q4czg" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431") : secret "image-registry-tls" not found Apr 17 20:50:50.424678 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.424635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-registry-certificates\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:50:50.425145 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.425122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-trusted-ca\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:50:50.428375 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.428335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-image-registry-private-configuration\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:50:50.428473 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.428430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-installation-pull-secrets\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " 
pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:50:50.433676 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.433649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89qhs\" (UniqueName: \"kubernetes.io/projected/dea8814d-46ce-4135-a1ce-f5b8ff97088a-kube-api-access-89qhs\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg" Apr 17 20:50:50.433780 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.433732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6fxz\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-kube-api-access-g6fxz\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:50:50.433994 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.433973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-bound-sa-token\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:50:50.435000 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.434978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s44w8\" (UniqueName: \"kubernetes.io/projected/3bd339db-87cf-44af-8c74-4c5f57f80ccc-kube-api-access-s44w8\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf" Apr 17 20:50:50.927643 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.927604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod 
\"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf" Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.927696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg" Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.927747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.927790 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:50.927810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.927846 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.927866 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert podName:3bd339db-87cf-44af-8c74-4c5f57f80ccc nodeName:}" failed. No retries permitted until 2026-04-17 20:50:51.927848538 +0000 UTC m=+36.162314922 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert") pod "ingress-canary-dpkwf" (UID: "3bd339db-87cf-44af-8c74-4c5f57f80ccc") : secret "canary-serving-cert" not found Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.927877 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.927920 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls podName:dea8814d-46ce-4135-a1ce-f5b8ff97088a nodeName:}" failed. No retries permitted until 2026-04-17 20:50:51.927902982 +0000 UTC m=+36.162369373 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls") pod "dns-default-kx6mg" (UID: "dea8814d-46ce-4135-a1ce-f5b8ff97088a") : secret "dns-default-metrics-tls" not found Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.927942 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert podName:c0f5afe7-3f63-49d2-8f14-97de5a47e278 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:51.927927053 +0000 UTC m=+36.162393444 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cs29g" (UID: "c0f5afe7-3f63-49d2-8f14-97de5a47e278") : secret "networking-console-plugin-cert" not found Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.927947 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.927960 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bfdd85f76-q4czg: secret "image-registry-tls" not found Apr 17 20:50:50.928147 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:50.928011 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls podName:34199fd4-6043-479b-9af9-8ddbd520a431 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:51.92799896 +0000 UTC m=+36.162465367 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls") pod "image-registry-5bfdd85f76-q4czg" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431") : secret "image-registry-tls" not found Apr 17 20:50:51.235669 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:51.235630 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:50:51.235839 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:51.235816 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:51.238349 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:51.238322 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:50:51.238541 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:51.238347 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 20:50:51.239265 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:51.239243 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:50:51.239397 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:51.239322 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jrdfc\"" Apr 17 20:50:51.937082 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:51.937038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:51.937127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf" Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:51.937193 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:50:51.937594 ip-10-0-130-66 
kubenswrapper[2576]: E0417 20:50:51.937211 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bfdd85f76-q4czg: secret "image-registry-tls" not found Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:51.937259 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:51.937194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg" Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:51.937281 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls podName:34199fd4-6043-479b-9af9-8ddbd520a431 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:53.937260835 +0000 UTC m=+38.171727240 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls") pod "image-registry-5bfdd85f76-q4czg" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431") : secret "image-registry-tls" not found Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:51.937300 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:51.937306 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls podName:dea8814d-46ce-4135-a1ce-f5b8ff97088a nodeName:}" failed. 
No retries permitted until 2026-04-17 20:50:53.937297413 +0000 UTC m=+38.171763798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls") pod "dns-default-kx6mg" (UID: "dea8814d-46ce-4135-a1ce-f5b8ff97088a") : secret "dns-default-metrics-tls" not found Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:51.937403 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert podName:3bd339db-87cf-44af-8c74-4c5f57f80ccc nodeName:}" failed. No retries permitted until 2026-04-17 20:50:53.937392044 +0000 UTC m=+38.171858433 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert") pod "ingress-canary-dpkwf" (UID: "3bd339db-87cf-44af-8c74-4c5f57f80ccc") : secret "canary-serving-cert" not found Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:51.937456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:51.937555 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:50:51.937594 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:51.937588 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert podName:c0f5afe7-3f63-49d2-8f14-97de5a47e278 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:50:53.937578492 +0000 UTC m=+38.172044882 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cs29g" (UID: "c0f5afe7-3f63-49d2-8f14-97de5a47e278") : secret "networking-console-plugin-cert" not found Apr 17 20:50:53.392436 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:53.392402 2576 generic.go:358] "Generic (PLEG): container finished" podID="1876416b-79dd-4ff1-88a4-b7111c5e304d" containerID="80e9de2f0cff206ee90b06ce9d4bda5442ba8884ca9f942c767537e112bd9a20" exitCode=0 Apr 17 20:50:53.392783 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:53.392459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" event={"ID":"1876416b-79dd-4ff1-88a4-b7111c5e304d","Type":"ContainerDied","Data":"80e9de2f0cff206ee90b06ce9d4bda5442ba8884ca9f942c767537e112bd9a20"} Apr 17 20:50:53.951258 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:53.951212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg" Apr 17 20:50:53.951435 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:53.951292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:50:53.951435 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:53.951348 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:50:53.951435 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:53.951388 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:50:53.951435 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:53.951419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf" Apr 17 20:50:53.951649 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:53.951444 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:50:53.951649 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:53.951457 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls podName:dea8814d-46ce-4135-a1ce-f5b8ff97088a nodeName:}" failed. No retries permitted until 2026-04-17 20:50:57.951435389 +0000 UTC m=+42.185901777 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls") pod "dns-default-kx6mg" (UID: "dea8814d-46ce-4135-a1ce-f5b8ff97088a") : secret "dns-default-metrics-tls" not found Apr 17 20:50:53.951649 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:53.951500 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert podName:c0f5afe7-3f63-49d2-8f14-97de5a47e278 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:57.951486069 +0000 UTC m=+42.185952454 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cs29g" (UID: "c0f5afe7-3f63-49d2-8f14-97de5a47e278") : secret "networking-console-plugin-cert" not found Apr 17 20:50:53.951649 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:53.951514 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:50:53.951649 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:53.951529 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bfdd85f76-q4czg: secret "image-registry-tls" not found Apr 17 20:50:53.951649 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:53.951528 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:50:53.951649 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:53.951572 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls podName:34199fd4-6043-479b-9af9-8ddbd520a431 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:50:57.951559463 +0000 UTC m=+42.186025851 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls") pod "image-registry-5bfdd85f76-q4czg" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431") : secret "image-registry-tls" not found Apr 17 20:50:53.951649 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:53.951589 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert podName:3bd339db-87cf-44af-8c74-4c5f57f80ccc nodeName:}" failed. No retries permitted until 2026-04-17 20:50:57.95158011 +0000 UTC m=+42.186046498 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert") pod "ingress-canary-dpkwf" (UID: "3bd339db-87cf-44af-8c74-4c5f57f80ccc") : secret "canary-serving-cert" not found Apr 17 20:50:54.397811 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.397724 2576 generic.go:358] "Generic (PLEG): container finished" podID="1876416b-79dd-4ff1-88a4-b7111c5e304d" containerID="0e540f753627fa0702ed3988f8af6bc4d045a7e47ef4f06946a76a615d27859b" exitCode=0 Apr 17 20:50:54.397811 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.397776 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" event={"ID":"1876416b-79dd-4ff1-88a4-b7111c5e304d","Type":"ContainerDied","Data":"0e540f753627fa0702ed3988f8af6bc4d045a7e47ef4f06946a76a615d27859b"} Apr 17 20:50:54.555839 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.555796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " 
pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:54.558802 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.558776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e49bff1-efa9-421a-8ee8-f431f0f0c109-original-pull-secret\") pod \"global-pull-secret-syncer-c6jkl\" (UID: \"7e49bff1-efa9-421a-8ee8-f431f0f0c109\") " pod="kube-system/global-pull-secret-syncer-c6jkl" Apr 17 20:50:54.749054 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.749020 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv"] Apr 17 20:50:54.752607 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.752583 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"] Apr 17 20:50:54.752751 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.752716 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv" Apr 17 20:50:54.755328 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.755297 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 20:50:54.755461 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.755344 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-mfjd6\"" Apr 17 20:50:54.755547 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.755529 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 20:50:54.755949 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.755935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2" Apr 17 20:50:54.756263 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.756247 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 20:50:54.756306 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.756294 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 20:50:54.758277 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.758259 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 20:50:54.761379 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.761341 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv"]
Apr 17 20:50:54.763934 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.763915 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"]
Apr 17 20:50:54.771201 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.771181 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"]
Apr 17 20:50:54.774956 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.774935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.777207 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.777184 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 20:50:54.777339 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.777325 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 20:50:54.777434 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.777370 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 20:50:54.777434 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.777378 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 20:50:54.784221 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.784196 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"]
Apr 17 20:50:54.854149 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.854113 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jkl"
Apr 17 20:50:54.857080 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.857133 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22pqm\" (UniqueName: \"kubernetes.io/projected/ec4f595a-d0db-454a-b1a5-99081de08505-kube-api-access-22pqm\") pod \"klusterlet-addon-workmgr-68df47dd47-5hkf2\" (UID: \"ec4f595a-d0db-454a-b1a5-99081de08505\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:50:54.857133 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ec4f595a-d0db-454a-b1a5-99081de08505-klusterlet-config\") pod \"klusterlet-addon-workmgr-68df47dd47-5hkf2\" (UID: \"ec4f595a-d0db-454a-b1a5-99081de08505\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:50:54.857255 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-hub\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.857402 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec4f595a-d0db-454a-b1a5-99081de08505-tmp\") pod \"klusterlet-addon-workmgr-68df47dd47-5hkf2\" (UID: \"ec4f595a-d0db-454a-b1a5-99081de08505\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:50:54.857402 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z5ls\" (UniqueName: \"kubernetes.io/projected/7622310a-8bd8-464e-b6a7-540e2b7b70e2-kube-api-access-9z5ls\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.857514 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdzjq\" (UniqueName: \"kubernetes.io/projected/728b2611-0e6b-4cdb-97c4-56695815ae8e-kube-api-access-sdzjq\") pod \"managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv\" (UID: \"728b2611-0e6b-4cdb-97c4-56695815ae8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv"
Apr 17 20:50:54.857514 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-ca\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.857514 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.857645 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7622310a-8bd8-464e-b6a7-540e2b7b70e2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.857645 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.857555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/728b2611-0e6b-4cdb-97c4-56695815ae8e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv\" (UID: \"728b2611-0e6b-4cdb-97c4-56695815ae8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv"
Apr 17 20:50:54.958872 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.958839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-ca\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.959024 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.958881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.959024 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.958908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7622310a-8bd8-464e-b6a7-540e2b7b70e2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.959024 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.958930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/728b2611-0e6b-4cdb-97c4-56695815ae8e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv\" (UID: \"728b2611-0e6b-4cdb-97c4-56695815ae8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv"
Apr 17 20:50:54.959024 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.958982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.959024 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.959017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22pqm\" (UniqueName: \"kubernetes.io/projected/ec4f595a-d0db-454a-b1a5-99081de08505-kube-api-access-22pqm\") pod \"klusterlet-addon-workmgr-68df47dd47-5hkf2\" (UID: \"ec4f595a-d0db-454a-b1a5-99081de08505\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:50:54.959225 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.959051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ec4f595a-d0db-454a-b1a5-99081de08505-klusterlet-config\") pod \"klusterlet-addon-workmgr-68df47dd47-5hkf2\" (UID: \"ec4f595a-d0db-454a-b1a5-99081de08505\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:50:54.959225 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.959096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-hub\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.959340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.959322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec4f595a-d0db-454a-b1a5-99081de08505-tmp\") pod \"klusterlet-addon-workmgr-68df47dd47-5hkf2\" (UID: \"ec4f595a-d0db-454a-b1a5-99081de08505\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:50:54.959418 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.959376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9z5ls\" (UniqueName: \"kubernetes.io/projected/7622310a-8bd8-464e-b6a7-540e2b7b70e2-kube-api-access-9z5ls\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.959470 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.959421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdzjq\" (UniqueName: \"kubernetes.io/projected/728b2611-0e6b-4cdb-97c4-56695815ae8e-kube-api-access-sdzjq\") pod \"managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv\" (UID: \"728b2611-0e6b-4cdb-97c4-56695815ae8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv"
Apr 17 20:50:54.960136 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.959936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7622310a-8bd8-464e-b6a7-540e2b7b70e2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.960324 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.960301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec4f595a-d0db-454a-b1a5-99081de08505-tmp\") pod \"klusterlet-addon-workmgr-68df47dd47-5hkf2\" (UID: \"ec4f595a-d0db-454a-b1a5-99081de08505\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:50:54.962269 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.962237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-ca\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.962503 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.962473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.962709 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.962678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.963073 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.963052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ec4f595a-d0db-454a-b1a5-99081de08505-klusterlet-config\") pod \"klusterlet-addon-workmgr-68df47dd47-5hkf2\" (UID: \"ec4f595a-d0db-454a-b1a5-99081de08505\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:50:54.963147 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.963100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/728b2611-0e6b-4cdb-97c4-56695815ae8e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv\" (UID: \"728b2611-0e6b-4cdb-97c4-56695815ae8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv"
Apr 17 20:50:54.963557 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.963537 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7622310a-8bd8-464e-b6a7-540e2b7b70e2-hub\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.967555 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.967526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z5ls\" (UniqueName: \"kubernetes.io/projected/7622310a-8bd8-464e-b6a7-540e2b7b70e2-kube-api-access-9z5ls\") pod \"cluster-proxy-proxy-agent-6678c54cc7-sv9g8\" (UID: \"7622310a-8bd8-464e-b6a7-540e2b7b70e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:54.967631 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.967555 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22pqm\" (UniqueName: \"kubernetes.io/projected/ec4f595a-d0db-454a-b1a5-99081de08505-kube-api-access-22pqm\") pod \"klusterlet-addon-workmgr-68df47dd47-5hkf2\" (UID: \"ec4f595a-d0db-454a-b1a5-99081de08505\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:50:54.968189 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.968167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdzjq\" (UniqueName: \"kubernetes.io/projected/728b2611-0e6b-4cdb-97c4-56695815ae8e-kube-api-access-sdzjq\") pod \"managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv\" (UID: \"728b2611-0e6b-4cdb-97c4-56695815ae8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv"
Apr 17 20:50:54.979230 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:54.979205 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c6jkl"]
Apr 17 20:50:54.983413 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:54.983391 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e49bff1_efa9_421a_8ee8_f431f0f0c109.slice/crio-723f835dbba84bb5347184b2dd522d22806fe97acf28ae204312ab6a584fa477 WatchSource:0}: Error finding container 723f835dbba84bb5347184b2dd522d22806fe97acf28ae204312ab6a584fa477: Status 404 returned error can't find the container with id 723f835dbba84bb5347184b2dd522d22806fe97acf28ae204312ab6a584fa477
Apr 17 20:50:55.072981 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.072883 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv"
Apr 17 20:50:55.078729 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.078703 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:50:55.085554 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.085529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"
Apr 17 20:50:55.233606 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.233404 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv"]
Apr 17 20:50:55.236267 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.236240 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"]
Apr 17 20:50:55.236449 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:55.236423 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod728b2611_0e6b_4cdb_97c4_56695815ae8e.slice/crio-1320f81cb09f91139ec80cc394277481ce4407bad234e1fa2197dc2b07768d41 WatchSource:0}: Error finding container 1320f81cb09f91139ec80cc394277481ce4407bad234e1fa2197dc2b07768d41: Status 404 returned error can't find the container with id 1320f81cb09f91139ec80cc394277481ce4407bad234e1fa2197dc2b07768d41
Apr 17 20:50:55.239599 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:55.239578 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec4f595a_d0db_454a_b1a5_99081de08505.slice/crio-e2fc18e2804fd9be986d906e3c9fa05cc7b436c8cad0b8b14328f2b87f7a576f WatchSource:0}: Error finding container e2fc18e2804fd9be986d906e3c9fa05cc7b436c8cad0b8b14328f2b87f7a576f: Status 404 returned error can't find the container with id e2fc18e2804fd9be986d906e3c9fa05cc7b436c8cad0b8b14328f2b87f7a576f
Apr 17 20:50:55.252152 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.252109 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8"]
Apr 17 20:50:55.256435 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:50:55.256400 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7622310a_8bd8_464e_b6a7_540e2b7b70e2.slice/crio-63e7a32126ae6f98047665e19f1ad1eaadea332b9f774e9c6828dc1f0659d04e WatchSource:0}: Error finding container 63e7a32126ae6f98047665e19f1ad1eaadea332b9f774e9c6828dc1f0659d04e: Status 404 returned error can't find the container with id 63e7a32126ae6f98047665e19f1ad1eaadea332b9f774e9c6828dc1f0659d04e
Apr 17 20:50:55.400211 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.400128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2" event={"ID":"ec4f595a-d0db-454a-b1a5-99081de08505","Type":"ContainerStarted","Data":"e2fc18e2804fd9be986d906e3c9fa05cc7b436c8cad0b8b14328f2b87f7a576f"}
Apr 17 20:50:55.401244 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.401219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c6jkl" event={"ID":"7e49bff1-efa9-421a-8ee8-f431f0f0c109","Type":"ContainerStarted","Data":"723f835dbba84bb5347184b2dd522d22806fe97acf28ae204312ab6a584fa477"}
Apr 17 20:50:55.404095 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.404071 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" event={"ID":"1876416b-79dd-4ff1-88a4-b7111c5e304d","Type":"ContainerStarted","Data":"3a2cfc8e8f688795df9b368caf61d16a77e37ced0d08615e59c9f4fecea17eff"}
Apr 17 20:50:55.405102 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.405073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv" event={"ID":"728b2611-0e6b-4cdb-97c4-56695815ae8e","Type":"ContainerStarted","Data":"1320f81cb09f91139ec80cc394277481ce4407bad234e1fa2197dc2b07768d41"}
Apr 17 20:50:55.405984 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.405960 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" event={"ID":"7622310a-8bd8-464e-b6a7-540e2b7b70e2","Type":"ContainerStarted","Data":"63e7a32126ae6f98047665e19f1ad1eaadea332b9f774e9c6828dc1f0659d04e"}
Apr 17 20:50:55.424350 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:55.424307 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rv6lj" podStartSLOduration=6.035106968 podStartE2EDuration="39.424295968s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" firstStartedPulling="2026-04-17 20:50:18.881297342 +0000 UTC m=+3.115763728" lastFinishedPulling="2026-04-17 20:50:52.270486332 +0000 UTC m=+36.504952728" observedRunningTime="2026-04-17 20:50:55.422953113 +0000 UTC m=+39.657419520" watchObservedRunningTime="2026-04-17 20:50:55.424295968 +0000 UTC m=+39.658762374"
Apr 17 20:50:57.989591 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:57.989535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf"
Apr 17 20:50:57.990057 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:57.989605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:50:57.990057 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:57.989644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"
Apr 17 20:50:57.990057 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:50:57.989758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:50:57.990057 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:57.989885 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:50:57.990057 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:57.989898 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bfdd85f76-q4czg: secret "image-registry-tls" not found
Apr 17 20:50:57.990057 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:57.989949 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls podName:34199fd4-6043-479b-9af9-8ddbd520a431 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:05.989930725 +0000 UTC m=+50.224397124 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls") pod "image-registry-5bfdd85f76-q4czg" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431") : secret "image-registry-tls" not found
Apr 17 20:50:57.990433 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:57.990342 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:50:57.990433 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:57.990403 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert podName:3bd339db-87cf-44af-8c74-4c5f57f80ccc nodeName:}" failed. No retries permitted until 2026-04-17 20:51:05.990392047 +0000 UTC m=+50.224858432 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert") pod "ingress-canary-dpkwf" (UID: "3bd339db-87cf-44af-8c74-4c5f57f80ccc") : secret "canary-serving-cert" not found
Apr 17 20:50:57.990566 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:57.990464 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:50:57.990566 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:57.990492 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls podName:dea8814d-46ce-4135-a1ce-f5b8ff97088a nodeName:}" failed. No retries permitted until 2026-04-17 20:51:05.990484856 +0000 UTC m=+50.224951245 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls") pod "dns-default-kx6mg" (UID: "dea8814d-46ce-4135-a1ce-f5b8ff97088a") : secret "dns-default-metrics-tls" not found
Apr 17 20:50:57.990566 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:57.990542 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 20:50:57.990724 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:50:57.990570 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert podName:c0f5afe7-3f63-49d2-8f14-97de5a47e278 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:05.990560639 +0000 UTC m=+50.225027026 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cs29g" (UID: "c0f5afe7-3f63-49d2-8f14-97de5a47e278") : secret "networking-console-plugin-cert" not found
Apr 17 20:51:02.426944 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:02.426885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv" event={"ID":"728b2611-0e6b-4cdb-97c4-56695815ae8e","Type":"ContainerStarted","Data":"852404535e848c61ecfe2b0e33dff5d7242c83b28d024b863dd30a4fe0cfbf99"}
Apr 17 20:51:02.428304 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:02.428248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" event={"ID":"7622310a-8bd8-464e-b6a7-540e2b7b70e2","Type":"ContainerStarted","Data":"5d816b153c0c2aaa423bb9b70fd88a53d6bd576c148885a82cde69d3e9436b51"}
Apr 17 20:51:02.429392 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:02.429348 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2" event={"ID":"ec4f595a-d0db-454a-b1a5-99081de08505","Type":"ContainerStarted","Data":"34605acfb069a82019f0887c25917c4a6ea949bfa104a3ebf516b5728929df51"}
Apr 17 20:51:02.430119 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:02.430098 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:51:02.431553 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:02.431533 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:51:02.432407 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:02.432384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c6jkl" event={"ID":"7e49bff1-efa9-421a-8ee8-f431f0f0c109","Type":"ContainerStarted","Data":"96331d293ad8f97657b7cd996de138275f2d3d2e19d39f4e1fe2c488e8bcd0b7"}
Apr 17 20:51:02.441679 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:02.441633 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv" podStartSLOduration=1.421326832 podStartE2EDuration="8.441621238s" podCreationTimestamp="2026-04-17 20:50:54 +0000 UTC" firstStartedPulling="2026-04-17 20:50:55.238622941 +0000 UTC m=+39.473089327" lastFinishedPulling="2026-04-17 20:51:02.258917345 +0000 UTC m=+46.493383733" observedRunningTime="2026-04-17 20:51:02.441302742 +0000 UTC m=+46.675769150" watchObservedRunningTime="2026-04-17 20:51:02.441621238 +0000 UTC m=+46.676087646"
Apr 17 20:51:02.456759 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:02.456687 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-c6jkl" podStartSLOduration=17.19003161 podStartE2EDuration="24.456674353s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:50:54.98514343 +0000 UTC m=+39.219609815" lastFinishedPulling="2026-04-17 20:51:02.251786158 +0000 UTC m=+46.486252558" observedRunningTime="2026-04-17 20:51:02.456476111 +0000 UTC m=+46.690942519" watchObservedRunningTime="2026-04-17 20:51:02.456674353 +0000 UTC m=+46.691140759"
Apr 17 20:51:02.471277 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:02.471198 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2" podStartSLOduration=1.443631503 podStartE2EDuration="8.471186108s" podCreationTimestamp="2026-04-17 20:50:54 +0000 UTC" firstStartedPulling="2026-04-17 20:50:55.241289764 +0000 UTC m=+39.475756149" lastFinishedPulling="2026-04-17 20:51:02.268844357 +0000 UTC m=+46.503310754" observedRunningTime="2026-04-17 20:51:02.470078882 +0000 UTC m=+46.704545285" watchObservedRunningTime="2026-04-17 20:51:02.471186108 +0000 UTC m=+46.705652514"
Apr 17 20:51:05.441499 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:05.441459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" event={"ID":"7622310a-8bd8-464e-b6a7-540e2b7b70e2","Type":"ContainerStarted","Data":"a859a362a9a8391284dcb67b1b70f0081eec390a31010d204170a53b2bb356bf"}
Apr 17 20:51:05.441499 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:05.441502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" event={"ID":"7622310a-8bd8-464e-b6a7-540e2b7b70e2","Type":"ContainerStarted","Data":"60611ff77fe4095a1ddedbbdc2109bd5ab2151f7b5fecff2d5ef9020ad0af122"}
Apr 17 20:51:05.461987 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:05.461944 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" podStartSLOduration=2.248512118 podStartE2EDuration="11.461932001s" podCreationTimestamp="2026-04-17 20:50:54 +0000 UTC" firstStartedPulling="2026-04-17 20:50:55.257572153 +0000 UTC m=+39.492038539" lastFinishedPulling="2026-04-17 20:51:04.47099202 +0000 UTC m=+48.705458422" observedRunningTime="2026-04-17 20:51:05.46105413 +0000 UTC m=+49.695520536" watchObservedRunningTime="2026-04-17 20:51:05.461932001 +0000 UTC m=+49.696398408"
Apr 17 20:51:06.053773 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:06.053733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:51:06.053773 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:06.053783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf"
Apr 17 20:51:06.054009 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:06.053810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg"
Apr 17 20:51:06.054009 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:06.053831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"
Apr 17 20:51:06.054009 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:06.053867 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:51:06.054009 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:06.053885 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bfdd85f76-q4czg: secret "image-registry-tls" not found
Apr 17 20:51:06.054009 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:06.053936 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls podName:34199fd4-6043-479b-9af9-8ddbd520a431 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:22.053920395 +0000 UTC m=+66.288386781 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls") pod "image-registry-5bfdd85f76-q4czg" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431") : secret "image-registry-tls" not found
Apr 17 20:51:06.054009 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:06.053940 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 20:51:06.054009 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:06.053960 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:51:06.054231 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:06.053940 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:51:06.054231 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:06.054031 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls podName:dea8814d-46ce-4135-a1ce-f5b8ff97088a nodeName:}" failed. No retries permitted until 2026-04-17 20:51:22.054013408 +0000 UTC m=+66.288479806 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls") pod "dns-default-kx6mg" (UID: "dea8814d-46ce-4135-a1ce-f5b8ff97088a") : secret "dns-default-metrics-tls" not found
Apr 17 20:51:06.054231 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:06.054053 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert podName:3bd339db-87cf-44af-8c74-4c5f57f80ccc nodeName:}" failed. No retries permitted until 2026-04-17 20:51:22.054039493 +0000 UTC m=+66.288505892 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert") pod "ingress-canary-dpkwf" (UID: "3bd339db-87cf-44af-8c74-4c5f57f80ccc") : secret "canary-serving-cert" not found Apr 17 20:51:06.054231 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:06.054073 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert podName:c0f5afe7-3f63-49d2-8f14-97de5a47e278 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:22.054064548 +0000 UTC m=+66.288530933 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cs29g" (UID: "c0f5afe7-3f63-49d2-8f14-97de5a47e278") : secret "networking-console-plugin-cert" not found Apr 17 20:51:15.386780 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:15.386753 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vzmr" Apr 17 20:51:21.970643 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:21.970578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:51:21.970643 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:21.970659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dc82\" (UniqueName: \"kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82\") pod \"network-check-target-rjt2l\" (UID: \"9e571f60-0b76-435b-aac4-aada6990b2b3\") " 
pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:51:21.973079 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:21.973055 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:51:21.973202 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:21.973093 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:51:21.980764 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:21.980743 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:51:21.980833 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:21.980802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs podName:b1da9568-78d7-4d7f-93b4-33b608a48c41 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:25.980787149 +0000 UTC m=+130.215253535 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs") pod "network-metrics-daemon-cxq8r" (UID: "b1da9568-78d7-4d7f-93b4-33b608a48c41") : secret "metrics-daemon-secret" not found Apr 17 20:51:21.982644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:21.982626 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:51:21.994019 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:21.993992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dc82\" (UniqueName: \"kubernetes.io/projected/9e571f60-0b76-435b-aac4-aada6990b2b3-kube-api-access-9dc82\") pod \"network-check-target-rjt2l\" (UID: \"9e571f60-0b76-435b-aac4-aada6990b2b3\") " pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:51:22.071519 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:22.071481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:51:22.071519 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:22.071524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf" Apr 17 20:51:22.071695 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:22.071548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") 
pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg" Apr 17 20:51:22.071695 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:22.071569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:51:22.071695 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:22.071644 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:51:22.071695 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:22.071648 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:51:22.071695 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:22.071680 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bfdd85f76-q4czg: secret "image-registry-tls" not found Apr 17 20:51:22.071695 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:22.071698 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert podName:c0f5afe7-3f63-49d2-8f14-97de5a47e278 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:54.071683284 +0000 UTC m=+98.306149683 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cs29g" (UID: "c0f5afe7-3f63-49d2-8f14-97de5a47e278") : secret "networking-console-plugin-cert" not found Apr 17 20:51:22.071919 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:22.071724 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls podName:34199fd4-6043-479b-9af9-8ddbd520a431 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:54.071712245 +0000 UTC m=+98.306178630 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls") pod "image-registry-5bfdd85f76-q4czg" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431") : secret "image-registry-tls" not found Apr 17 20:51:22.071919 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:22.071644 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:22.071919 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:22.071645 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:22.071919 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:22.071791 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert podName:3bd339db-87cf-44af-8c74-4c5f57f80ccc nodeName:}" failed. No retries permitted until 2026-04-17 20:51:54.071779469 +0000 UTC m=+98.306245854 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert") pod "ingress-canary-dpkwf" (UID: "3bd339db-87cf-44af-8c74-4c5f57f80ccc") : secret "canary-serving-cert" not found Apr 17 20:51:22.071919 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:22.071812 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls podName:dea8814d-46ce-4135-a1ce-f5b8ff97088a nodeName:}" failed. No retries permitted until 2026-04-17 20:51:54.071803209 +0000 UTC m=+98.306269596 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls") pod "dns-default-kx6mg" (UID: "dea8814d-46ce-4135-a1ce-f5b8ff97088a") : secret "dns-default-metrics-tls" not found Apr 17 20:51:22.150234 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:22.150201 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jrdfc\"" Apr 17 20:51:22.158096 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:22.158073 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:51:22.271587 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:22.271553 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rjt2l"] Apr 17 20:51:22.274434 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:51:22.274401 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e571f60_0b76_435b_aac4_aada6990b2b3.slice/crio-b8b3ec3c084f074d4fa36b9d6c2b45ea01e42ef9e5ec2c3ae4840033e9c0379c WatchSource:0}: Error finding container b8b3ec3c084f074d4fa36b9d6c2b45ea01e42ef9e5ec2c3ae4840033e9c0379c: Status 404 returned error can't find the container with id b8b3ec3c084f074d4fa36b9d6c2b45ea01e42ef9e5ec2c3ae4840033e9c0379c Apr 17 20:51:22.484830 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:22.484744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rjt2l" event={"ID":"9e571f60-0b76-435b-aac4-aada6990b2b3","Type":"ContainerStarted","Data":"b8b3ec3c084f074d4fa36b9d6c2b45ea01e42ef9e5ec2c3ae4840033e9c0379c"} Apr 17 20:51:25.493128 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:25.493089 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rjt2l" event={"ID":"9e571f60-0b76-435b-aac4-aada6990b2b3","Type":"ContainerStarted","Data":"c1821d45d6e05e0725cc674422d454674c99afdd640b41a0a2cbad96faeb63f6"} Apr 17 20:51:25.493592 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:25.493218 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:51:25.509927 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:25.509880 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rjt2l" 
podStartSLOduration=66.931048527 podStartE2EDuration="1m9.509866549s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" firstStartedPulling="2026-04-17 20:51:22.276302618 +0000 UTC m=+66.510769002" lastFinishedPulling="2026-04-17 20:51:24.855120635 +0000 UTC m=+69.089587024" observedRunningTime="2026-04-17 20:51:25.509451457 +0000 UTC m=+69.743917863" watchObservedRunningTime="2026-04-17 20:51:25.509866549 +0000 UTC m=+69.744332953" Apr 17 20:51:54.120556 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:54.120503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:54.120614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:54.120638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf" Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:54.120645 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:51:54.121011 ip-10-0-130-66 
kubenswrapper[2576]: I0417 20:51:54.120656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg" Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:54.120710 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert podName:c0f5afe7-3f63-49d2-8f14-97de5a47e278 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:58.120693435 +0000 UTC m=+162.355159820 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cs29g" (UID: "c0f5afe7-3f63-49d2-8f14-97de5a47e278") : secret "networking-console-plugin-cert" not found Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:54.120744 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:54.120744 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:54.120807 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls podName:dea8814d-46ce-4135-a1ce-f5b8ff97088a nodeName:}" failed. No retries permitted until 2026-04-17 20:52:58.120790771 +0000 UTC m=+162.355257155 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls") pod "dns-default-kx6mg" (UID: "dea8814d-46ce-4135-a1ce-f5b8ff97088a") : secret "dns-default-metrics-tls" not found Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:54.120826 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert podName:3bd339db-87cf-44af-8c74-4c5f57f80ccc nodeName:}" failed. No retries permitted until 2026-04-17 20:52:58.120815973 +0000 UTC m=+162.355282358 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert") pod "ingress-canary-dpkwf" (UID: "3bd339db-87cf-44af-8c74-4c5f57f80ccc") : secret "canary-serving-cert" not found Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:54.120845 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:54.120854 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bfdd85f76-q4czg: secret "image-registry-tls" not found Apr 17 20:51:54.121011 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:51:54.120890 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls podName:34199fd4-6043-479b-9af9-8ddbd520a431 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:58.12087797 +0000 UTC m=+162.355344357 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls") pod "image-registry-5bfdd85f76-q4czg" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431") : secret "image-registry-tls" not found Apr 17 20:51:56.498459 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:51:56.498424 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rjt2l" Apr 17 20:52:26.065273 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:52:26.065214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:52:26.065908 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:26.065408 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:52:26.065908 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:26.065509 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs podName:b1da9568-78d7-4d7f-93b4-33b608a48c41 nodeName:}" failed. No retries permitted until 2026-04-17 20:54:28.06548591 +0000 UTC m=+252.299952311 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs") pod "network-metrics-daemon-cxq8r" (UID: "b1da9568-78d7-4d7f-93b4-33b608a48c41") : secret "metrics-daemon-secret" not found Apr 17 20:52:52.946430 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:52:52.946401 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qcw6q_f96c4ba0-6cee-4727-bef7-248a0da4b215/dns-node-resolver/0.log" Apr 17 20:52:53.169934 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:53.169889 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" podUID="34199fd4-6043-479b-9af9-8ddbd520a431" Apr 17 20:52:53.190163 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:53.190137 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-kx6mg" podUID="dea8814d-46ce-4135-a1ce-f5b8ff97088a" Apr 17 20:52:53.200342 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:53.200284 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" podUID="c0f5afe7-3f63-49d2-8f14-97de5a47e278" Apr 17 20:52:53.217462 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:53.217438 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dpkwf" podUID="3bd339db-87cf-44af-8c74-4c5f57f80ccc" Apr 17 20:52:53.249907 ip-10-0-130-66 
kubenswrapper[2576]: E0417 20:52:53.249875 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-cxq8r" podUID="b1da9568-78d7-4d7f-93b4-33b608a48c41" Apr 17 20:52:53.702930 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:52:53.702904 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:52:53.703088 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:52:53.702936 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:52:53.703088 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:52:53.703058 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kx6mg" Apr 17 20:52:53.741854 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:52:53.741823 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-f2pm8_bf8a6449-19c6-469b-9f9c-049cc3f220b8/node-ca/0.log" Apr 17 20:52:58.218304 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:52:58.218257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") pod \"image-registry-5bfdd85f76-q4czg\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") " pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:52:58.218304 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:52:58.218304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " 
pod="openshift-ingress-canary/ingress-canary-dpkwf" Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:52:58.218414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg" Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:58.218424 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:58.218444 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:58.218449 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bfdd85f76-q4czg: secret "image-registry-tls" not found Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:58.218504 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:52:58.218444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:58.218522 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls podName:34199fd4-6043-479b-9af9-8ddbd520a431 nodeName:}" failed. No retries permitted until 2026-04-17 20:55:00.218500666 +0000 UTC m=+284.452967052 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls") pod "image-registry-5bfdd85f76-q4czg" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431") : secret "image-registry-tls" not found Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:58.218544 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert podName:3bd339db-87cf-44af-8c74-4c5f57f80ccc nodeName:}" failed. No retries permitted until 2026-04-17 20:55:00.21853514 +0000 UTC m=+284.453001527 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert") pod "ingress-canary-dpkwf" (UID: "3bd339db-87cf-44af-8c74-4c5f57f80ccc") : secret "canary-serving-cert" not found Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:58.218559 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:58.218559 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert podName:c0f5afe7-3f63-49d2-8f14-97de5a47e278 nodeName:}" failed. No retries permitted until 2026-04-17 20:55:00.218552074 +0000 UTC m=+284.453018459 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cs29g" (UID: "c0f5afe7-3f63-49d2-8f14-97de5a47e278") : secret "networking-console-plugin-cert" not found Apr 17 20:52:58.218810 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:52:58.218628 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls podName:dea8814d-46ce-4135-a1ce-f5b8ff97088a nodeName:}" failed. No retries permitted until 2026-04-17 20:55:00.218616452 +0000 UTC m=+284.453082837 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls") pod "dns-default-kx6mg" (UID: "dea8814d-46ce-4135-a1ce-f5b8ff97088a") : secret "dns-default-metrics-tls" not found Apr 17 20:53:02.430794 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:02.430736 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2" podUID="ec4f595a-d0db-454a-b1a5-99081de08505" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.11:8000/readyz\": dial tcp 10.132.0.11:8000: connect: connection refused" Apr 17 20:53:02.723995 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:02.723965 2576 generic.go:358] "Generic (PLEG): container finished" podID="728b2611-0e6b-4cdb-97c4-56695815ae8e" containerID="852404535e848c61ecfe2b0e33dff5d7242c83b28d024b863dd30a4fe0cfbf99" exitCode=255 Apr 17 20:53:02.724174 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:02.724042 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv" 
event={"ID":"728b2611-0e6b-4cdb-97c4-56695815ae8e","Type":"ContainerDied","Data":"852404535e848c61ecfe2b0e33dff5d7242c83b28d024b863dd30a4fe0cfbf99"}
Apr 17 20:53:02.724459 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:02.724433 2576 scope.go:117] "RemoveContainer" containerID="852404535e848c61ecfe2b0e33dff5d7242c83b28d024b863dd30a4fe0cfbf99"
Apr 17 20:53:02.725324 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:02.725301 2576 generic.go:358] "Generic (PLEG): container finished" podID="ec4f595a-d0db-454a-b1a5-99081de08505" containerID="34605acfb069a82019f0887c25917c4a6ea949bfa104a3ebf516b5728929df51" exitCode=1
Apr 17 20:53:02.725445 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:02.725337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2" event={"ID":"ec4f595a-d0db-454a-b1a5-99081de08505","Type":"ContainerDied","Data":"34605acfb069a82019f0887c25917c4a6ea949bfa104a3ebf516b5728929df51"}
Apr 17 20:53:02.725672 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:02.725647 2576 scope.go:117] "RemoveContainer" containerID="34605acfb069a82019f0887c25917c4a6ea949bfa104a3ebf516b5728929df51"
Apr 17 20:53:03.729518 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:03.729476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c8f8d4f5-hccrv" event={"ID":"728b2611-0e6b-4cdb-97c4-56695815ae8e","Type":"ContainerStarted","Data":"596b4dd7dae103eacf20beb32a958cf8aa51df06971a8d9861224b614a1aaf5a"}
Apr 17 20:53:03.731033 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:03.731007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2" event={"ID":"ec4f595a-d0db-454a-b1a5-99081de08505","Type":"ContainerStarted","Data":"4b26dea9bce2447ef02cf433140484079fcf33e8d2902eba87d24ba2b37e602d"}
Apr 17 20:53:03.731300 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:03.731282 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:53:03.731880 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:03.731863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68df47dd47-5hkf2"
Apr 17 20:53:06.236609 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:06.236570 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r"
Apr 17 20:53:08.235816 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:08.235774 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dpkwf"
Apr 17 20:53:10.748331 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.748299 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dpp5j"]
Apr 17 20:53:10.751534 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.751514 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.754089 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.754056 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 20:53:10.754993 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.754966 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 20:53:10.755102 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.755078 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-j2pjg\""
Apr 17 20:53:10.755274 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.755255 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 20:53:10.755341 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.755286 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 20:53:10.760943 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.760912 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dpp5j"]
Apr 17 20:53:10.827843 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.827813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khjnh\" (UniqueName: \"kubernetes.io/projected/3f030b7e-da00-454b-9b12-10b15cc9e274-kube-api-access-khjnh\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.827843 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.827847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3f030b7e-da00-454b-9b12-10b15cc9e274-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.828105 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.827874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3f030b7e-da00-454b-9b12-10b15cc9e274-crio-socket\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.828105 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.827901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3f030b7e-da00-454b-9b12-10b15cc9e274-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.828105 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.827960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f030b7e-da00-454b-9b12-10b15cc9e274-data-volume\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.928661 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.928627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khjnh\" (UniqueName: \"kubernetes.io/projected/3f030b7e-da00-454b-9b12-10b15cc9e274-kube-api-access-khjnh\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.928661 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.928664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3f030b7e-da00-454b-9b12-10b15cc9e274-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.928893 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.928694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3f030b7e-da00-454b-9b12-10b15cc9e274-crio-socket\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.928893 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.928719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3f030b7e-da00-454b-9b12-10b15cc9e274-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.928893 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.928735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f030b7e-da00-454b-9b12-10b15cc9e274-data-volume\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.928893 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.928841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3f030b7e-da00-454b-9b12-10b15cc9e274-crio-socket\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.929068 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.929054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f030b7e-da00-454b-9b12-10b15cc9e274-data-volume\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.930349 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.930334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3f030b7e-da00-454b-9b12-10b15cc9e274-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.931108 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.931092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3f030b7e-da00-454b-9b12-10b15cc9e274-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:10.949423 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:10.949401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khjnh\" (UniqueName: \"kubernetes.io/projected/3f030b7e-da00-454b-9b12-10b15cc9e274-kube-api-access-khjnh\") pod \"insights-runtime-extractor-dpp5j\" (UID: \"3f030b7e-da00-454b-9b12-10b15cc9e274\") " pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:11.061760 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:11.061653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dpp5j"
Apr 17 20:53:11.176592 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:11.176550 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dpp5j"]
Apr 17 20:53:11.180849 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:53:11.180821 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f030b7e_da00_454b_9b12_10b15cc9e274.slice/crio-9f4934bbe2f7fd2513d546f379b7f112128a39af3a2c003df71ed69202af8d66 WatchSource:0}: Error finding container 9f4934bbe2f7fd2513d546f379b7f112128a39af3a2c003df71ed69202af8d66: Status 404 returned error can't find the container with id 9f4934bbe2f7fd2513d546f379b7f112128a39af3a2c003df71ed69202af8d66
Apr 17 20:53:11.750441 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:11.750378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dpp5j" event={"ID":"3f030b7e-da00-454b-9b12-10b15cc9e274","Type":"ContainerStarted","Data":"9d9c9d0d2b9a8db9dfb643a1b64eb925506dcd412312b8b2dff97bc6b3156952"}
Apr 17 20:53:11.750441 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:11.750417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dpp5j" event={"ID":"3f030b7e-da00-454b-9b12-10b15cc9e274","Type":"ContainerStarted","Data":"9f4934bbe2f7fd2513d546f379b7f112128a39af3a2c003df71ed69202af8d66"}
Apr 17 20:53:12.754969 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:12.754926 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dpp5j" event={"ID":"3f030b7e-da00-454b-9b12-10b15cc9e274","Type":"ContainerStarted","Data":"f22da8c4be4e903af0c33fa96ff1194b0ca487fa79d7b024f4fcf2de334e88c6"}
Apr 17 20:53:13.759560 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:13.759519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dpp5j" event={"ID":"3f030b7e-da00-454b-9b12-10b15cc9e274","Type":"ContainerStarted","Data":"601bc40f17c0e7c9e40bdcf783f7503242c3f600ca1db20abd38927bf38dca6d"}
Apr 17 20:53:13.779764 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:13.779717 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dpp5j" podStartSLOduration=1.970650097 podStartE2EDuration="3.779704653s" podCreationTimestamp="2026-04-17 20:53:10 +0000 UTC" firstStartedPulling="2026-04-17 20:53:11.236976578 +0000 UTC m=+175.471442963" lastFinishedPulling="2026-04-17 20:53:13.046031131 +0000 UTC m=+177.280497519" observedRunningTime="2026-04-17 20:53:13.778224364 +0000 UTC m=+178.012690772" watchObservedRunningTime="2026-04-17 20:53:13.779704653 +0000 UTC m=+178.014171094"
Apr 17 20:53:25.086824 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:25.086784 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" podUID="7622310a-8bd8-464e-b6a7-540e2b7b70e2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 20:53:27.588057 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.588020 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-k8wvj"]
Apr 17 20:53:27.591213 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.591194 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.594598 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.594574 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 20:53:27.594598 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.594591 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 20:53:27.594764 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.594609 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 20:53:27.594764 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.594591 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hmlbz\""
Apr 17 20:53:27.594942 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.594929 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 20:53:27.595579 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.595564 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 20:53:27.595579 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.595571 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 20:53:27.657373 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.657321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-root\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.657559 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.657395 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-accelerators-collector-config\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.657559 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.657438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-tls\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.657559 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.657463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-metrics-client-ca\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.657722 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.657556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-wtmp\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.657722 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.657595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z977g\" (UniqueName: \"kubernetes.io/projected/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-kube-api-access-z977g\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.657722 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.657655 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.657722 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.657686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-textfile\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.657852 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.657731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-sys\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.758862 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.758829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-wtmp\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.758862 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.758865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z977g\" (UniqueName: \"kubernetes.io/projected/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-kube-api-access-z977g\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759052 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.758905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759052 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.758932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-textfile\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759052 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.758956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-sys\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759052 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.758986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-root\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759052 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.759017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-wtmp\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759052 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.759042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-accelerators-collector-config\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759331 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.759073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-sys\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759331 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.759087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-tls\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759331 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.759097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-root\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759331 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.759135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-metrics-client-ca\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759331 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:53:27.759162 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 20:53:27.759331 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:53:27.759213 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-tls podName:b04ae14e-8db1-4d80-bdfa-810545a2b5ef nodeName:}" failed. No retries permitted until 2026-04-17 20:53:28.259196468 +0000 UTC m=+192.493662867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-tls") pod "node-exporter-k8wvj" (UID: "b04ae14e-8db1-4d80-bdfa-810545a2b5ef") : secret "node-exporter-tls" not found
Apr 17 20:53:27.759653 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.759437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-textfile\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.759726 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.759703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-accelerators-collector-config\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.760149 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.760132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-metrics-client-ca\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.761370 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.761338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:27.767173 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:27.767154 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z977g\" (UniqueName: \"kubernetes.io/projected/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-kube-api-access-z977g\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:28.263290 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:28.263256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-tls\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:28.265739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:28.265722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b04ae14e-8db1-4d80-bdfa-810545a2b5ef-node-exporter-tls\") pod \"node-exporter-k8wvj\" (UID: \"b04ae14e-8db1-4d80-bdfa-810545a2b5ef\") " pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:28.499677 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:28.499630 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k8wvj"
Apr 17 20:53:28.507787 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:53:28.507748 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb04ae14e_8db1_4d80_bdfa_810545a2b5ef.slice/crio-605b8986e6287cf25c82a82460ffe1313f9a85125a79212ddc0e42ddc5e630b4 WatchSource:0}: Error finding container 605b8986e6287cf25c82a82460ffe1313f9a85125a79212ddc0e42ddc5e630b4: Status 404 returned error can't find the container with id 605b8986e6287cf25c82a82460ffe1313f9a85125a79212ddc0e42ddc5e630b4
Apr 17 20:53:28.795609 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:28.795577 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8wvj" event={"ID":"b04ae14e-8db1-4d80-bdfa-810545a2b5ef","Type":"ContainerStarted","Data":"605b8986e6287cf25c82a82460ffe1313f9a85125a79212ddc0e42ddc5e630b4"}
Apr 17 20:53:29.799448 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:29.799412 2576 generic.go:358] "Generic (PLEG): container finished" podID="b04ae14e-8db1-4d80-bdfa-810545a2b5ef" containerID="b3a1c7e8eb3fe3be8c52f2e396b17dddc022ed8114fa9e4d7bd3eb031a368b4b" exitCode=0
Apr 17 20:53:29.799923 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:29.799498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8wvj" event={"ID":"b04ae14e-8db1-4d80-bdfa-810545a2b5ef","Type":"ContainerDied","Data":"b3a1c7e8eb3fe3be8c52f2e396b17dddc022ed8114fa9e4d7bd3eb031a368b4b"}
Apr 17 20:53:30.806502 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:30.806466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8wvj" event={"ID":"b04ae14e-8db1-4d80-bdfa-810545a2b5ef","Type":"ContainerStarted","Data":"ce047d5f68319bfb4bf9fa47d85d267534fb107a0127c4e9c11e2e50826a7af2"}
Apr 17 20:53:30.806502 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:30.806501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8wvj" event={"ID":"b04ae14e-8db1-4d80-bdfa-810545a2b5ef","Type":"ContainerStarted","Data":"01c06eca7d38df8fa2013dfc7be56d71543eda5f713b1f301a731935b1dbba68"}
Apr 17 20:53:30.827283 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:30.827233 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-k8wvj" podStartSLOduration=3.140067285 podStartE2EDuration="3.827217744s" podCreationTimestamp="2026-04-17 20:53:27 +0000 UTC" firstStartedPulling="2026-04-17 20:53:28.509667343 +0000 UTC m=+192.744133727" lastFinishedPulling="2026-04-17 20:53:29.196817798 +0000 UTC m=+193.431284186" observedRunningTime="2026-04-17 20:53:30.82518079 +0000 UTC m=+195.059647215" watchObservedRunningTime="2026-04-17 20:53:30.827217744 +0000 UTC m=+195.061684152"
Apr 17 20:53:35.086830 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:35.086791 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" podUID="7622310a-8bd8-464e-b6a7-540e2b7b70e2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 20:53:42.945020 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:42.944986 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5bfdd85f76-q4czg"]
Apr 17 20:53:42.945418 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:53:42.945210 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" podUID="34199fd4-6043-479b-9af9-8ddbd520a431"
Apr 17 20:53:43.836231 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.836196 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:53:43.840276 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.840254 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg"
Apr 17 20:53:43.894947 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.894909 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-trusted-ca\") pod \"34199fd4-6043-479b-9af9-8ddbd520a431\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") "
Apr 17 20:53:43.894947 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.894949 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6fxz\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-kube-api-access-g6fxz\") pod \"34199fd4-6043-479b-9af9-8ddbd520a431\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") "
Apr 17 20:53:43.895187 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.894969 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-bound-sa-token\") pod \"34199fd4-6043-479b-9af9-8ddbd520a431\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") "
Apr 17 20:53:43.895187 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.895004 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34199fd4-6043-479b-9af9-8ddbd520a431-ca-trust-extracted\") pod \"34199fd4-6043-479b-9af9-8ddbd520a431\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") "
Apr 17 20:53:43.895187 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.895061 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-image-registry-private-configuration\") pod \"34199fd4-6043-479b-9af9-8ddbd520a431\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") "
Apr 17 20:53:43.895187 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.895081 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-registry-certificates\") pod \"34199fd4-6043-479b-9af9-8ddbd520a431\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") "
Apr 17 20:53:43.895187 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.895110 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-installation-pull-secrets\") pod \"34199fd4-6043-479b-9af9-8ddbd520a431\" (UID: \"34199fd4-6043-479b-9af9-8ddbd520a431\") "
Apr 17 20:53:43.895481 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.895425 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34199fd4-6043-479b-9af9-8ddbd520a431-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "34199fd4-6043-479b-9af9-8ddbd520a431" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:53:43.895572 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.895482 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "34199fd4-6043-479b-9af9-8ddbd520a431" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:53:43.895644 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.895623 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "34199fd4-6043-479b-9af9-8ddbd520a431" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:53:43.897527 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.897493 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "34199fd4-6043-479b-9af9-8ddbd520a431" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:53:43.897527 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.897503 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-kube-api-access-g6fxz" (OuterVolumeSpecName: "kube-api-access-g6fxz") pod "34199fd4-6043-479b-9af9-8ddbd520a431" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431"). InnerVolumeSpecName "kube-api-access-g6fxz".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:53:43.897797 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.897773 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "34199fd4-6043-479b-9af9-8ddbd520a431" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:53:43.897864 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.897804 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "34199fd4-6043-479b-9af9-8ddbd520a431" (UID: "34199fd4-6043-479b-9af9-8ddbd520a431"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:53:43.995985 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.995951 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-bound-sa-token\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 20:53:43.995985 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.995981 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34199fd4-6043-479b-9af9-8ddbd520a431-ca-trust-extracted\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 20:53:43.995985 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.995992 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-image-registry-private-configuration\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 
20:53:43.996433 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.996003 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-registry-certificates\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 20:53:43.996433 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.996013 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34199fd4-6043-479b-9af9-8ddbd520a431-installation-pull-secrets\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 20:53:43.996433 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.996023 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34199fd4-6043-479b-9af9-8ddbd520a431-trusted-ca\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 20:53:43.996433 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:43.996031 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g6fxz\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-kube-api-access-g6fxz\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 20:53:44.838281 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:44.838240 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5bfdd85f76-q4czg" Apr 17 20:53:44.866404 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:44.866343 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5bfdd85f76-q4czg"] Apr 17 20:53:44.871924 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:44.871897 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5bfdd85f76-q4czg"] Apr 17 20:53:45.003896 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:45.003867 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34199fd4-6043-479b-9af9-8ddbd520a431-registry-tls\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 20:53:45.086925 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:45.086887 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" podUID="7622310a-8bd8-464e-b6a7-540e2b7b70e2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 20:53:45.087065 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:45.086973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" Apr 17 20:53:45.087571 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:45.087539 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a859a362a9a8391284dcb67b1b70f0081eec390a31010d204170a53b2bb356bf"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 20:53:45.087627 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:45.087612 2576 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" podUID="7622310a-8bd8-464e-b6a7-540e2b7b70e2" containerName="service-proxy" containerID="cri-o://a859a362a9a8391284dcb67b1b70f0081eec390a31010d204170a53b2bb356bf" gracePeriod=30 Apr 17 20:53:45.842407 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:45.842370 2576 generic.go:358] "Generic (PLEG): container finished" podID="7622310a-8bd8-464e-b6a7-540e2b7b70e2" containerID="a859a362a9a8391284dcb67b1b70f0081eec390a31010d204170a53b2bb356bf" exitCode=2 Apr 17 20:53:45.842593 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:45.842427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" event={"ID":"7622310a-8bd8-464e-b6a7-540e2b7b70e2","Type":"ContainerDied","Data":"a859a362a9a8391284dcb67b1b70f0081eec390a31010d204170a53b2bb356bf"} Apr 17 20:53:45.842593 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:45.842463 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6678c54cc7-sv9g8" event={"ID":"7622310a-8bd8-464e-b6a7-540e2b7b70e2","Type":"ContainerStarted","Data":"e15fcf75078e2ed1cf31b143ab3c4584f4b54d965fd57eeda108f1636f581dbe"} Apr 17 20:53:46.238739 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:53:46.238699 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34199fd4-6043-479b-9af9-8ddbd520a431" path="/var/lib/kubelet/pods/34199fd4-6043-479b-9af9-8ddbd520a431/volumes" Apr 17 20:54:28.142328 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:54:28.142274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:54:28.144821 
ip-10-0-130-66 kubenswrapper[2576]: I0417 20:54:28.144795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1da9568-78d7-4d7f-93b4-33b608a48c41-metrics-certs\") pod \"network-metrics-daemon-cxq8r\" (UID: \"b1da9568-78d7-4d7f-93b4-33b608a48c41\") " pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:54:28.439566 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:54:28.439539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f22jz\"" Apr 17 20:54:28.448432 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:54:28.448409 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxq8r" Apr 17 20:54:28.564839 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:54:28.564806 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cxq8r"] Apr 17 20:54:28.568063 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:54:28.568017 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1da9568_78d7_4d7f_93b4_33b608a48c41.slice/crio-eb972b31557d05310649ff95b3639e70f3ca76245182cea3f77c9d5f93bbddf4 WatchSource:0}: Error finding container eb972b31557d05310649ff95b3639e70f3ca76245182cea3f77c9d5f93bbddf4: Status 404 returned error can't find the container with id eb972b31557d05310649ff95b3639e70f3ca76245182cea3f77c9d5f93bbddf4 Apr 17 20:54:28.952248 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:54:28.952210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cxq8r" event={"ID":"b1da9568-78d7-4d7f-93b4-33b608a48c41","Type":"ContainerStarted","Data":"eb972b31557d05310649ff95b3639e70f3ca76245182cea3f77c9d5f93bbddf4"} Apr 17 20:54:29.958191 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:54:29.958156 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cxq8r" event={"ID":"b1da9568-78d7-4d7f-93b4-33b608a48c41","Type":"ContainerStarted","Data":"dba94748519d66397eb69300d909c32f4e9caafcd1570cc122b03c581ed9a0fc"} Apr 17 20:54:29.958191 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:54:29.958196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cxq8r" event={"ID":"b1da9568-78d7-4d7f-93b4-33b608a48c41","Type":"ContainerStarted","Data":"36ed70f3e855156727c2ed9dc067b26d88aa94412213f31f478b97dda55abf10"} Apr 17 20:54:29.973405 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:54:29.973345 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cxq8r" podStartSLOduration=252.992253182 podStartE2EDuration="4m13.973333144s" podCreationTimestamp="2026-04-17 20:50:16 +0000 UTC" firstStartedPulling="2026-04-17 20:54:28.569747408 +0000 UTC m=+252.804213793" lastFinishedPulling="2026-04-17 20:54:29.550827369 +0000 UTC m=+253.785293755" observedRunningTime="2026-04-17 20:54:29.971772017 +0000 UTC m=+254.206238439" watchObservedRunningTime="2026-04-17 20:54:29.973333144 +0000 UTC m=+254.207799552" Apr 17 20:54:56.703509 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:54:56.703459 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" podUID="c0f5afe7-3f63-49d2-8f14-97de5a47e278" Apr 17 20:54:56.703509 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:54:56.703466 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-kx6mg" podUID="dea8814d-46ce-4135-a1ce-f5b8ff97088a" Apr 17 20:54:57.022532 ip-10-0-130-66 
kubenswrapper[2576]: I0417 20:54:57.022449 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kx6mg" Apr 17 20:54:57.022532 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:54:57.022493 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:55:00.290024 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.289978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf" Apr 17 20:55:00.290024 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.290031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " pod="openshift-dns/dns-default-kx6mg" Apr 17 20:55:00.290691 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.290065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:55:00.292778 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.292744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dea8814d-46ce-4135-a1ce-f5b8ff97088a-metrics-tls\") pod \"dns-default-kx6mg\" (UID: \"dea8814d-46ce-4135-a1ce-f5b8ff97088a\") " 
pod="openshift-dns/dns-default-kx6mg" Apr 17 20:55:00.292778 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.292765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bd339db-87cf-44af-8c74-4c5f57f80ccc-cert\") pod \"ingress-canary-dpkwf\" (UID: \"3bd339db-87cf-44af-8c74-4c5f57f80ccc\") " pod="openshift-ingress-canary/ingress-canary-dpkwf" Apr 17 20:55:00.292922 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.292806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0f5afe7-3f63-49d2-8f14-97de5a47e278-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cs29g\" (UID: \"c0f5afe7-3f63-49d2-8f14-97de5a47e278\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:55:00.326683 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.326648 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tswpv\"" Apr 17 20:55:00.327436 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.327420 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-ct2lq\"" Apr 17 20:55:00.334273 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.334255 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" Apr 17 20:55:00.334384 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.334276 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kx6mg" Apr 17 20:55:00.439309 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.439285 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jvfcm\"" Apr 17 20:55:00.447171 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.447145 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dpkwf" Apr 17 20:55:00.457399 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.457346 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cs29g"] Apr 17 20:55:00.460570 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:55:00.460515 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0f5afe7_3f63_49d2_8f14_97de5a47e278.slice/crio-c51cd4134b1f7d37ac9604b232ceea3e1265de1d51a7837871fe59faa11b4488 WatchSource:0}: Error finding container c51cd4134b1f7d37ac9604b232ceea3e1265de1d51a7837871fe59faa11b4488: Status 404 returned error can't find the container with id c51cd4134b1f7d37ac9604b232ceea3e1265de1d51a7837871fe59faa11b4488 Apr 17 20:55:00.478001 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.477972 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kx6mg"] Apr 17 20:55:00.481929 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:55:00.481894 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea8814d_46ce_4135_a1ce_f5b8ff97088a.slice/crio-b157fd6f77d9febaa888bec0649a0997cb83982f77df4e6b1378f8d05f6e8452 WatchSource:0}: Error finding container b157fd6f77d9febaa888bec0649a0997cb83982f77df4e6b1378f8d05f6e8452: Status 404 returned error can't find the container with id b157fd6f77d9febaa888bec0649a0997cb83982f77df4e6b1378f8d05f6e8452 Apr 17 
20:55:00.565622 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:00.565548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dpkwf"] Apr 17 20:55:00.568609 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:55:00.568580 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bd339db_87cf_44af_8c74_4c5f57f80ccc.slice/crio-698b587067c2a7deb0fb598144cb2a2c3eccdc4b645b56c8be1536bff012d511 WatchSource:0}: Error finding container 698b587067c2a7deb0fb598144cb2a2c3eccdc4b645b56c8be1536bff012d511: Status 404 returned error can't find the container with id 698b587067c2a7deb0fb598144cb2a2c3eccdc4b645b56c8be1536bff012d511 Apr 17 20:55:01.033230 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:01.033180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kx6mg" event={"ID":"dea8814d-46ce-4135-a1ce-f5b8ff97088a","Type":"ContainerStarted","Data":"b157fd6f77d9febaa888bec0649a0997cb83982f77df4e6b1378f8d05f6e8452"} Apr 17 20:55:01.034375 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:01.034312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" event={"ID":"c0f5afe7-3f63-49d2-8f14-97de5a47e278","Type":"ContainerStarted","Data":"c51cd4134b1f7d37ac9604b232ceea3e1265de1d51a7837871fe59faa11b4488"} Apr 17 20:55:01.035412 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:01.035387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dpkwf" event={"ID":"3bd339db-87cf-44af-8c74-4c5f57f80ccc","Type":"ContainerStarted","Data":"698b587067c2a7deb0fb598144cb2a2c3eccdc4b645b56c8be1536bff012d511"} Apr 17 20:55:02.039634 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:02.039581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" 
event={"ID":"c0f5afe7-3f63-49d2-8f14-97de5a47e278","Type":"ContainerStarted","Data":"d18ce65d595918d87d326beda60717a637027a8f030b82c72bc0cfc089111e93"} Apr 17 20:55:02.055492 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:02.055430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cs29g" podStartSLOduration=264.889745485 podStartE2EDuration="4m26.055410846s" podCreationTimestamp="2026-04-17 20:50:36 +0000 UTC" firstStartedPulling="2026-04-17 20:55:00.462836665 +0000 UTC m=+284.697303050" lastFinishedPulling="2026-04-17 20:55:01.628502022 +0000 UTC m=+285.862968411" observedRunningTime="2026-04-17 20:55:02.053552672 +0000 UTC m=+286.288019080" watchObservedRunningTime="2026-04-17 20:55:02.055410846 +0000 UTC m=+286.289877269" Apr 17 20:55:03.044007 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:03.043974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kx6mg" event={"ID":"dea8814d-46ce-4135-a1ce-f5b8ff97088a","Type":"ContainerStarted","Data":"0eeab4cf6acdc37a49ff8985424997320cef0ecb44c654c9814b14170a2e63d4"} Apr 17 20:55:03.044007 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:03.044011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kx6mg" event={"ID":"dea8814d-46ce-4135-a1ce-f5b8ff97088a","Type":"ContainerStarted","Data":"7143c8e416c560dd302167c390648bf25ac1716cf2ae1107fd03f38dd9ca26e6"} Apr 17 20:55:03.044542 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:03.044063 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kx6mg" Apr 17 20:55:03.045340 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:03.045320 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dpkwf" 
event={"ID":"3bd339db-87cf-44af-8c74-4c5f57f80ccc","Type":"ContainerStarted","Data":"1d764fcfbdfa34e9bd64082e6c51bd08c3374a41251bc5f49702d054e5265e68"} Apr 17 20:55:03.060122 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:03.060077 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kx6mg" podStartSLOduration=250.945984114 podStartE2EDuration="4m13.060064609s" podCreationTimestamp="2026-04-17 20:50:50 +0000 UTC" firstStartedPulling="2026-04-17 20:55:00.483527867 +0000 UTC m=+284.717994254" lastFinishedPulling="2026-04-17 20:55:02.597608349 +0000 UTC m=+286.832074749" observedRunningTime="2026-04-17 20:55:03.058580778 +0000 UTC m=+287.293047185" watchObservedRunningTime="2026-04-17 20:55:03.060064609 +0000 UTC m=+287.294531015" Apr 17 20:55:03.072710 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:03.072672 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dpkwf" podStartSLOduration=251.041801573 podStartE2EDuration="4m13.072659387s" podCreationTimestamp="2026-04-17 20:50:50 +0000 UTC" firstStartedPulling="2026-04-17 20:55:00.570420351 +0000 UTC m=+284.804886735" lastFinishedPulling="2026-04-17 20:55:02.60127815 +0000 UTC m=+286.835744549" observedRunningTime="2026-04-17 20:55:03.071603552 +0000 UTC m=+287.306069958" watchObservedRunningTime="2026-04-17 20:55:03.072659387 +0000 UTC m=+287.307125793" Apr 17 20:55:13.050967 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:13.050933 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kx6mg" Apr 17 20:55:16.170941 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:16.170911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log" Apr 17 20:55:16.171337 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:16.171316 2576 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log" Apr 17 20:55:16.176278 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:55:16.176254 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 20:57:55.762613 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.762579 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-6zn7h"] Apr 17 20:57:55.765609 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.765590 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-6zn7h" Apr 17 20:57:55.767884 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.767862 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 20:57:55.768029 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.767862 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-gpvzd\"" Apr 17 20:57:55.768664 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.768636 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 20:57:55.771626 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.771606 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-6zn7h"] Apr 17 20:57:55.811549 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.811511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrfh9\" (UniqueName: \"kubernetes.io/projected/2733f315-b45a-4d05-b426-13bc79452ef9-kube-api-access-lrfh9\") pod \"cert-manager-759f64656b-6zn7h\" (UID: \"2733f315-b45a-4d05-b426-13bc79452ef9\") " pod="cert-manager/cert-manager-759f64656b-6zn7h" Apr 17 20:57:55.811708 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.811559 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2733f315-b45a-4d05-b426-13bc79452ef9-bound-sa-token\") pod \"cert-manager-759f64656b-6zn7h\" (UID: \"2733f315-b45a-4d05-b426-13bc79452ef9\") " pod="cert-manager/cert-manager-759f64656b-6zn7h" Apr 17 20:57:55.912489 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.912438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrfh9\" (UniqueName: \"kubernetes.io/projected/2733f315-b45a-4d05-b426-13bc79452ef9-kube-api-access-lrfh9\") pod \"cert-manager-759f64656b-6zn7h\" (UID: \"2733f315-b45a-4d05-b426-13bc79452ef9\") " pod="cert-manager/cert-manager-759f64656b-6zn7h" Apr 17 20:57:55.912489 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.912501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2733f315-b45a-4d05-b426-13bc79452ef9-bound-sa-token\") pod \"cert-manager-759f64656b-6zn7h\" (UID: \"2733f315-b45a-4d05-b426-13bc79452ef9\") " pod="cert-manager/cert-manager-759f64656b-6zn7h" Apr 17 20:57:55.919932 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.919898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2733f315-b45a-4d05-b426-13bc79452ef9-bound-sa-token\") pod \"cert-manager-759f64656b-6zn7h\" (UID: \"2733f315-b45a-4d05-b426-13bc79452ef9\") " pod="cert-manager/cert-manager-759f64656b-6zn7h" Apr 17 20:57:55.919932 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:55.919911 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrfh9\" (UniqueName: \"kubernetes.io/projected/2733f315-b45a-4d05-b426-13bc79452ef9-kube-api-access-lrfh9\") pod \"cert-manager-759f64656b-6zn7h\" (UID: \"2733f315-b45a-4d05-b426-13bc79452ef9\") " 
pod="cert-manager/cert-manager-759f64656b-6zn7h"
Apr 17 20:57:56.075053 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:56.074959 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-6zn7h"
Apr 17 20:57:56.189062 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:56.189012 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-6zn7h"]
Apr 17 20:57:56.192167 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:57:56.192135 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2733f315_b45a_4d05_b426_13bc79452ef9.slice/crio-ea3bf7f9d41c944ed58b3a89db6522453d9ec136813f56d66db658cd72c5334c WatchSource:0}: Error finding container ea3bf7f9d41c944ed58b3a89db6522453d9ec136813f56d66db658cd72c5334c: Status 404 returned error can't find the container with id ea3bf7f9d41c944ed58b3a89db6522453d9ec136813f56d66db658cd72c5334c
Apr 17 20:57:56.193933 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:56.193916 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 20:57:56.492862 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:57:56.492819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-6zn7h" event={"ID":"2733f315-b45a-4d05-b426-13bc79452ef9","Type":"ContainerStarted","Data":"ea3bf7f9d41c944ed58b3a89db6522453d9ec136813f56d66db658cd72c5334c"}
Apr 17 20:58:01.508457 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:01.508418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-6zn7h" event={"ID":"2733f315-b45a-4d05-b426-13bc79452ef9","Type":"ContainerStarted","Data":"eab4cfb0c2b440f9623d4a7bc83476ae24ebf644e77bc2d42b821bf253fa095f"}
Apr 17 20:58:01.525397 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:01.525320 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-6zn7h" podStartSLOduration=2.008398504 podStartE2EDuration="6.525302227s" podCreationTimestamp="2026-04-17 20:57:55 +0000 UTC" firstStartedPulling="2026-04-17 20:57:56.194043096 +0000 UTC m=+460.428509481" lastFinishedPulling="2026-04-17 20:58:00.710946816 +0000 UTC m=+464.945413204" observedRunningTime="2026-04-17 20:58:01.524416612 +0000 UTC m=+465.758883019" watchObservedRunningTime="2026-04-17 20:58:01.525302227 +0000 UTC m=+465.759768634"
Apr 17 20:58:04.776104 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.776071 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"]
Apr 17 20:58:04.779130 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.779113 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.782745 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.782722 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 20:58:04.782869 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.782741 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:58:04.782869 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.782746 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 20:58:04.782869 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.782813 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-rkdwc\""
Apr 17 20:58:04.782869 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.782771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 20:58:04.783113 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.783097 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 20:58:04.787653 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.787626 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"]
Apr 17 20:58:04.880210 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.880168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-metrics-cert\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.880430 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.880212 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-manager-config\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.880430 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.880245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9whv\" (UniqueName: \"kubernetes.io/projected/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-kube-api-access-t9whv\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.880430 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.880278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-cert\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.981583 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.981546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-cert\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.981737 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.981617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-metrics-cert\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.981737 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.981646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-manager-config\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.981737 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.981667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9whv\" (UniqueName: \"kubernetes.io/projected/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-kube-api-access-t9whv\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.982352 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.982333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-manager-config\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.984257 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.984235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-metrics-cert\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.984349 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.984276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-cert\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:04.990835 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:04.990812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9whv\" (UniqueName: \"kubernetes.io/projected/5ce3fe5b-22e6-491f-9eed-cb71671ee9c0-kube-api-access-t9whv\") pod \"lws-controller-manager-7bd8bcccff-fqj79\" (UID: \"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:05.089462 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:05.089390 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:05.212528 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:05.212498 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"]
Apr 17 20:58:05.216208 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:58:05.216180 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ce3fe5b_22e6_491f_9eed_cb71671ee9c0.slice/crio-94324902b5013e4d1f539323b233654dd2cdf7ad3dd2896b06aacd3d7c1b6e44 WatchSource:0}: Error finding container 94324902b5013e4d1f539323b233654dd2cdf7ad3dd2896b06aacd3d7c1b6e44: Status 404 returned error can't find the container with id 94324902b5013e4d1f539323b233654dd2cdf7ad3dd2896b06aacd3d7c1b6e44
Apr 17 20:58:05.520132 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:05.520089 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79" event={"ID":"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0","Type":"ContainerStarted","Data":"94324902b5013e4d1f539323b233654dd2cdf7ad3dd2896b06aacd3d7c1b6e44"}
Apr 17 20:58:09.532082 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.532041 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79" event={"ID":"5ce3fe5b-22e6-491f-9eed-cb71671ee9c0","Type":"ContainerStarted","Data":"193bbe57db1ad1862ecbb47d8f91c1307461a0f1630911b09ffa7c7a17aca454"}
Apr 17 20:58:09.532573 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.532149 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:09.554598 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.554544 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79" podStartSLOduration=1.459352258 podStartE2EDuration="5.554525299s" podCreationTimestamp="2026-04-17 20:58:04 +0000 UTC" firstStartedPulling="2026-04-17 20:58:05.217969729 +0000 UTC m=+469.452436114" lastFinishedPulling="2026-04-17 20:58:09.313142766 +0000 UTC m=+473.547609155" observedRunningTime="2026-04-17 20:58:09.552579073 +0000 UTC m=+473.787045492" watchObservedRunningTime="2026-04-17 20:58:09.554525299 +0000 UTC m=+473.788991708"
Apr 17 20:58:09.752616 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.752582 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"]
Apr 17 20:58:09.755629 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.755611 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:09.758293 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.758272 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 20:58:09.758449 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.758400 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 20:58:09.759474 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.759459 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 20:58:09.759551 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.759465 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 20:58:09.760010 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.759996 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jpcvb\""
Apr 17 20:58:09.770061 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.770039 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"]
Apr 17 20:58:09.819453 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.819423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37e19582-7399-462d-8bb9-575954b406de-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj\" (UID: \"37e19582-7399-462d-8bb9-575954b406de\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:09.819586 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.819460 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqczm\" (UniqueName: \"kubernetes.io/projected/37e19582-7399-462d-8bb9-575954b406de-kube-api-access-sqczm\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj\" (UID: \"37e19582-7399-462d-8bb9-575954b406de\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:09.819586 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.819534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37e19582-7399-462d-8bb9-575954b406de-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj\" (UID: \"37e19582-7399-462d-8bb9-575954b406de\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:09.920696 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.920665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37e19582-7399-462d-8bb9-575954b406de-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj\" (UID: \"37e19582-7399-462d-8bb9-575954b406de\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:09.920825 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.920704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqczm\" (UniqueName: \"kubernetes.io/projected/37e19582-7399-462d-8bb9-575954b406de-kube-api-access-sqczm\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj\" (UID: \"37e19582-7399-462d-8bb9-575954b406de\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:09.920825 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.920741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37e19582-7399-462d-8bb9-575954b406de-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj\" (UID: \"37e19582-7399-462d-8bb9-575954b406de\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:09.923252 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.923225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37e19582-7399-462d-8bb9-575954b406de-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj\" (UID: \"37e19582-7399-462d-8bb9-575954b406de\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:09.923371 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.923340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37e19582-7399-462d-8bb9-575954b406de-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj\" (UID: \"37e19582-7399-462d-8bb9-575954b406de\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:09.930022 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:09.930002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqczm\" (UniqueName: \"kubernetes.io/projected/37e19582-7399-462d-8bb9-575954b406de-kube-api-access-sqczm\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj\" (UID: \"37e19582-7399-462d-8bb9-575954b406de\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:10.065233 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:10.065151 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:10.186158 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:10.186126 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"]
Apr 17 20:58:10.190015 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:58:10.189986 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37e19582_7399_462d_8bb9_575954b406de.slice/crio-019cae7698ad86699a28c220cbc14a8108f5acce7324310f91d7fef9706fb7cd WatchSource:0}: Error finding container 019cae7698ad86699a28c220cbc14a8108f5acce7324310f91d7fef9706fb7cd: Status 404 returned error can't find the container with id 019cae7698ad86699a28c220cbc14a8108f5acce7324310f91d7fef9706fb7cd
Apr 17 20:58:10.535520 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:10.535481 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj" event={"ID":"37e19582-7399-462d-8bb9-575954b406de","Type":"ContainerStarted","Data":"019cae7698ad86699a28c220cbc14a8108f5acce7324310f91d7fef9706fb7cd"}
Apr 17 20:58:13.545089 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:13.545050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj" event={"ID":"37e19582-7399-462d-8bb9-575954b406de","Type":"ContainerStarted","Data":"6bf400d65da144b5420405ede1953eeb190ab079821aef6e6cb5c38bb2c65ab8"}
Apr 17 20:58:13.545496 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:13.545209 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:58:13.569034 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:13.568987 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj" podStartSLOduration=2.041730695 podStartE2EDuration="4.568974307s" podCreationTimestamp="2026-04-17 20:58:09 +0000 UTC" firstStartedPulling="2026-04-17 20:58:10.191907708 +0000 UTC m=+474.426374093" lastFinishedPulling="2026-04-17 20:58:12.71915131 +0000 UTC m=+476.953617705" observedRunningTime="2026-04-17 20:58:13.567849889 +0000 UTC m=+477.802316296" watchObservedRunningTime="2026-04-17 20:58:13.568974307 +0000 UTC m=+477.803440715"
Apr 17 20:58:20.537749 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:20.537717 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-fqj79"
Apr 17 20:58:24.549583 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:58:24.549549 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj"
Apr 17 20:59:18.930239 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:18.930159 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"]
Apr 17 20:59:18.933401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:18.933380 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:18.935822 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:18.935797 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 20:59:18.935968 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:18.935797 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-nn459\""
Apr 17 20:59:18.935968 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:18.935809 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 17 20:59:18.936781 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:18.936763 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 20:59:18.944562 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:18.944542 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"]
Apr 17 20:59:19.045267 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.045231 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gq92\" (UniqueName: \"kubernetes.io/projected/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-kube-api-access-2gq92\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.045459 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.045279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.045459 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.045318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.045459 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.045337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.045459 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.045373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.045459 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.045442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.045663 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.045523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.045663 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.045568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.045663 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.045607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146287 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gq92\" (UniqueName: \"kubernetes.io/projected/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-kube-api-access-2gq92\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146476 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146476 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146430 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146476 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146590 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146590 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146590 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146590 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146841 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146841 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146841 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.146997 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.146898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.147073 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.147037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.147256 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.147238 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.148976 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.148953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.149317 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.149295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.166030 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.166000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gq92\" (UniqueName: \"kubernetes.io/projected/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-kube-api-access-2gq92\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.168550 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.168533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1c2464fc-7069-43f6-b2cf-7e7d792a9f27-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs\" (UID: \"1c2464fc-7069-43f6-b2cf-7e7d792a9f27\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.245401 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.245373 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"
Apr 17 20:59:19.368857 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.368823 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs"]
Apr 17 20:59:19.372241 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:59:19.372213 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c2464fc_7069_43f6_b2cf_7e7d792a9f27.slice/crio-ed936e0513302fbdb90dd527c3e92b0bfc16535b7356fe9cc136bfbe1ee8177a WatchSource:0}: Error finding container ed936e0513302fbdb90dd527c3e92b0bfc16535b7356fe9cc136bfbe1ee8177a: Status 404 returned error can't find the container with id ed936e0513302fbdb90dd527c3e92b0bfc16535b7356fe9cc136bfbe1ee8177a
Apr 17 20:59:19.721450 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:19.721417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs" event={"ID":"1c2464fc-7069-43f6-b2cf-7e7d792a9f27","Type":"ContainerStarted","Data":"ed936e0513302fbdb90dd527c3e92b0bfc16535b7356fe9cc136bfbe1ee8177a"}
Apr 17 20:59:22.002996 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:22.002938 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 17 20:59:22.003288 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:22.003013 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 17 20:59:22.003288 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:22.003039 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 17 20:59:22.734117 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:22.734077 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs" event={"ID":"1c2464fc-7069-43f6-b2cf-7e7d792a9f27","Type":"ContainerStarted","Data":"ea2360ff6b7454ee1f6ae5b73d977559bb936b96b487f4f99cfd07680a85fcce"}
Apr 17 20:59:22.760998 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:22.760946 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs" podStartSLOduration=2.132426686 podStartE2EDuration="4.760931729s" podCreationTimestamp="2026-04-17 20:59:18 +0000 UTC" firstStartedPulling="2026-04-17 20:59:19.374199724 +0000 UTC m=+543.608666109" lastFinishedPulling="2026-04-17 20:59:22.002704768 +0000 UTC m=+546.237171152" observedRunningTime="2026-04-17 20:59:22.759648235 +0000 UTC m=+546.994114642" watchObservedRunningTime="2026-04-17 20:59:22.760931729 +0000 UTC m=+546.995398135"
Apr 17 20:59:23.246504 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:23.246464 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs" Apr 17 20:59:23.250995 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:23.250971 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs" Apr 17 20:59:23.737427 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:23.737396 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs" Apr 17 20:59:23.738406 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:23.738384 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs" Apr 17 20:59:27.494631 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.492661 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z8rk5"] Apr 17 20:59:27.497132 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.497105 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" Apr 17 20:59:27.500884 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.500855 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 20:59:27.501010 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.500885 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-jblgt\"" Apr 17 20:59:27.501861 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.501842 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 20:59:27.504643 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.504623 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z8rk5"] Apr 17 20:59:27.614503 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.614466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6dz\" (UniqueName: \"kubernetes.io/projected/d6feb217-d914-46ec-b7e1-242a7488d89a-kube-api-access-ss6dz\") pod \"kuadrant-operator-catalog-z8rk5\" (UID: \"d6feb217-d914-46ec-b7e1-242a7488d89a\") " pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" Apr 17 20:59:27.715575 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.715533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss6dz\" (UniqueName: \"kubernetes.io/projected/d6feb217-d914-46ec-b7e1-242a7488d89a-kube-api-access-ss6dz\") pod \"kuadrant-operator-catalog-z8rk5\" (UID: \"d6feb217-d914-46ec-b7e1-242a7488d89a\") " pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" Apr 17 20:59:27.723592 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.723555 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6dz\" (UniqueName: 
\"kubernetes.io/projected/d6feb217-d914-46ec-b7e1-242a7488d89a-kube-api-access-ss6dz\") pod \"kuadrant-operator-catalog-z8rk5\" (UID: \"d6feb217-d914-46ec-b7e1-242a7488d89a\") " pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" Apr 17 20:59:27.807197 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.807117 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" Apr 17 20:59:27.852174 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.852135 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z8rk5"] Apr 17 20:59:27.924545 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:27.924424 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z8rk5"] Apr 17 20:59:27.927345 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:59:27.927317 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6feb217_d914_46ec_b7e1_242a7488d89a.slice/crio-14b74f7e89383425f932c459bcf85af7eff6e7e75af507419c63b6f0b77d5c52 WatchSource:0}: Error finding container 14b74f7e89383425f932c459bcf85af7eff6e7e75af507419c63b6f0b77d5c52: Status 404 returned error can't find the container with id 14b74f7e89383425f932c459bcf85af7eff6e7e75af507419c63b6f0b77d5c52 Apr 17 20:59:28.060111 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:28.060024 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hqrnw"] Apr 17 20:59:28.064515 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:28.064490 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" Apr 17 20:59:28.069446 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:28.069419 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hqrnw"] Apr 17 20:59:28.118931 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:28.118900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx87h\" (UniqueName: \"kubernetes.io/projected/6f5eb69d-710e-4362-ba09-d3f4c87c5124-kube-api-access-zx87h\") pod \"kuadrant-operator-catalog-hqrnw\" (UID: \"6f5eb69d-710e-4362-ba09-d3f4c87c5124\") " pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" Apr 17 20:59:28.219305 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:28.219270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx87h\" (UniqueName: \"kubernetes.io/projected/6f5eb69d-710e-4362-ba09-d3f4c87c5124-kube-api-access-zx87h\") pod \"kuadrant-operator-catalog-hqrnw\" (UID: \"6f5eb69d-710e-4362-ba09-d3f4c87c5124\") " pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" Apr 17 20:59:28.227742 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:28.227713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx87h\" (UniqueName: \"kubernetes.io/projected/6f5eb69d-710e-4362-ba09-d3f4c87c5124-kube-api-access-zx87h\") pod \"kuadrant-operator-catalog-hqrnw\" (UID: \"6f5eb69d-710e-4362-ba09-d3f4c87c5124\") " pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" Apr 17 20:59:28.375184 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:28.375109 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" Apr 17 20:59:28.503659 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:28.503627 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hqrnw"] Apr 17 20:59:28.571219 ip-10-0-130-66 kubenswrapper[2576]: W0417 20:59:28.571177 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f5eb69d_710e_4362_ba09_d3f4c87c5124.slice/crio-540eee338f60da28be5daaaa0c8df9d3cb0b764acdba7a780b9a2cdbf1a37448 WatchSource:0}: Error finding container 540eee338f60da28be5daaaa0c8df9d3cb0b764acdba7a780b9a2cdbf1a37448: Status 404 returned error can't find the container with id 540eee338f60da28be5daaaa0c8df9d3cb0b764acdba7a780b9a2cdbf1a37448 Apr 17 20:59:28.754004 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:28.753964 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" event={"ID":"d6feb217-d914-46ec-b7e1-242a7488d89a","Type":"ContainerStarted","Data":"14b74f7e89383425f932c459bcf85af7eff6e7e75af507419c63b6f0b77d5c52"} Apr 17 20:59:28.755479 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:28.755021 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" event={"ID":"6f5eb69d-710e-4362-ba09-d3f4c87c5124","Type":"ContainerStarted","Data":"540eee338f60da28be5daaaa0c8df9d3cb0b764acdba7a780b9a2cdbf1a37448"} Apr 17 20:59:30.763255 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:30.763210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" event={"ID":"d6feb217-d914-46ec-b7e1-242a7488d89a","Type":"ContainerStarted","Data":"0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a"} Apr 17 20:59:30.763255 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:30.763247 2576 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" podUID="d6feb217-d914-46ec-b7e1-242a7488d89a" containerName="registry-server" containerID="cri-o://0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a" gracePeriod=2 Apr 17 20:59:30.764604 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:30.764579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" event={"ID":"6f5eb69d-710e-4362-ba09-d3f4c87c5124","Type":"ContainerStarted","Data":"7d546bd6c8148788e431b1f1d609b4453172ac10b597d3b9dd61e85f0c8c9383"} Apr 17 20:59:30.786143 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:30.786102 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" podStartSLOduration=1.831850557 podStartE2EDuration="3.786089822s" podCreationTimestamp="2026-04-17 20:59:27 +0000 UTC" firstStartedPulling="2026-04-17 20:59:27.928662553 +0000 UTC m=+552.163128939" lastFinishedPulling="2026-04-17 20:59:29.882901819 +0000 UTC m=+554.117368204" observedRunningTime="2026-04-17 20:59:30.784513996 +0000 UTC m=+555.018980402" watchObservedRunningTime="2026-04-17 20:59:30.786089822 +0000 UTC m=+555.020556228" Apr 17 20:59:30.798590 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:30.798552 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" podStartSLOduration=1.510647179 podStartE2EDuration="2.798538367s" podCreationTimestamp="2026-04-17 20:59:28 +0000 UTC" firstStartedPulling="2026-04-17 20:59:28.572573765 +0000 UTC m=+552.807040149" lastFinishedPulling="2026-04-17 20:59:29.860464945 +0000 UTC m=+554.094931337" observedRunningTime="2026-04-17 20:59:30.797847336 +0000 UTC m=+555.032313743" watchObservedRunningTime="2026-04-17 20:59:30.798538367 +0000 UTC m=+555.033004773" Apr 17 20:59:31.003662 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.003638 2576 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" Apr 17 20:59:31.142093 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.141993 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss6dz\" (UniqueName: \"kubernetes.io/projected/d6feb217-d914-46ec-b7e1-242a7488d89a-kube-api-access-ss6dz\") pod \"d6feb217-d914-46ec-b7e1-242a7488d89a\" (UID: \"d6feb217-d914-46ec-b7e1-242a7488d89a\") " Apr 17 20:59:31.144326 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.144297 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6feb217-d914-46ec-b7e1-242a7488d89a-kube-api-access-ss6dz" (OuterVolumeSpecName: "kube-api-access-ss6dz") pod "d6feb217-d914-46ec-b7e1-242a7488d89a" (UID: "d6feb217-d914-46ec-b7e1-242a7488d89a"). InnerVolumeSpecName "kube-api-access-ss6dz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:59:31.242757 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.242723 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ss6dz\" (UniqueName: \"kubernetes.io/projected/d6feb217-d914-46ec-b7e1-242a7488d89a-kube-api-access-ss6dz\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 20:59:31.769100 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.769067 2576 generic.go:358] "Generic (PLEG): container finished" podID="d6feb217-d914-46ec-b7e1-242a7488d89a" containerID="0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a" exitCode=0 Apr 17 20:59:31.769551 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.769126 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" Apr 17 20:59:31.769551 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.769154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" event={"ID":"d6feb217-d914-46ec-b7e1-242a7488d89a","Type":"ContainerDied","Data":"0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a"} Apr 17 20:59:31.769551 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.769194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-z8rk5" event={"ID":"d6feb217-d914-46ec-b7e1-242a7488d89a","Type":"ContainerDied","Data":"14b74f7e89383425f932c459bcf85af7eff6e7e75af507419c63b6f0b77d5c52"} Apr 17 20:59:31.769551 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.769211 2576 scope.go:117] "RemoveContainer" containerID="0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a" Apr 17 20:59:31.778750 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.778734 2576 scope.go:117] "RemoveContainer" containerID="0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a" Apr 17 20:59:31.779031 ip-10-0-130-66 kubenswrapper[2576]: E0417 20:59:31.778993 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a\": container with ID starting with 0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a not found: ID does not exist" containerID="0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a" Apr 17 20:59:31.779121 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.779038 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a"} err="failed to get container status \"0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a\": rpc error: 
code = NotFound desc = could not find container \"0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a\": container with ID starting with 0d575d1d818e1d4b611c71fa9034e2bb1c9ed375361c28cfe4388314e8735f2a not found: ID does not exist" Apr 17 20:59:31.788109 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.788086 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z8rk5"] Apr 17 20:59:31.791181 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:31.791162 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z8rk5"] Apr 17 20:59:32.239924 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:32.239884 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6feb217-d914-46ec-b7e1-242a7488d89a" path="/var/lib/kubelet/pods/d6feb217-d914-46ec-b7e1-242a7488d89a/volumes" Apr 17 20:59:38.375705 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:38.375663 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" Apr 17 20:59:38.376112 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:38.375716 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" Apr 17 20:59:38.397140 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:38.397106 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" Apr 17 20:59:38.820395 ip-10-0-130-66 kubenswrapper[2576]: I0417 20:59:38.820349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-hqrnw" Apr 17 21:00:02.639665 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:02.639628 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-vjkk6"] Apr 17 21:00:02.640084 ip-10-0-130-66 kubenswrapper[2576]: I0417 
21:00:02.639882 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6feb217-d914-46ec-b7e1-242a7488d89a" containerName="registry-server" Apr 17 21:00:02.640084 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:02.639893 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6feb217-d914-46ec-b7e1-242a7488d89a" containerName="registry-server" Apr 17 21:00:02.640084 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:02.639948 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6feb217-d914-46ec-b7e1-242a7488d89a" containerName="registry-server" Apr 17 21:00:02.647754 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:02.647732 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-vjkk6" Apr 17 21:00:02.651199 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:02.651168 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-bgq7k\"" Apr 17 21:00:02.652569 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:02.652547 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-vjkk6"] Apr 17 21:00:02.788345 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:02.788293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks5sn\" (UniqueName: \"kubernetes.io/projected/cbb471c8-b495-4f9d-bb0c-73b1cedda753-kube-api-access-ks5sn\") pod \"authorino-operator-657f44b778-vjkk6\" (UID: \"cbb471c8-b495-4f9d-bb0c-73b1cedda753\") " pod="kuadrant-system/authorino-operator-657f44b778-vjkk6" Apr 17 21:00:02.889789 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:02.889692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks5sn\" (UniqueName: \"kubernetes.io/projected/cbb471c8-b495-4f9d-bb0c-73b1cedda753-kube-api-access-ks5sn\") pod 
\"authorino-operator-657f44b778-vjkk6\" (UID: \"cbb471c8-b495-4f9d-bb0c-73b1cedda753\") " pod="kuadrant-system/authorino-operator-657f44b778-vjkk6" Apr 17 21:00:02.898880 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:02.898853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks5sn\" (UniqueName: \"kubernetes.io/projected/cbb471c8-b495-4f9d-bb0c-73b1cedda753-kube-api-access-ks5sn\") pod \"authorino-operator-657f44b778-vjkk6\" (UID: \"cbb471c8-b495-4f9d-bb0c-73b1cedda753\") " pod="kuadrant-system/authorino-operator-657f44b778-vjkk6" Apr 17 21:00:02.959163 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:02.959133 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-vjkk6" Apr 17 21:00:03.080138 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:03.080114 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-vjkk6"] Apr 17 21:00:03.083066 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:00:03.083035 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb471c8_b495_4f9d_bb0c_73b1cedda753.slice/crio-5b3944d8dfcfb09835b32a8477f0611e04ecaadc469144798da19f6d7643b7f3 WatchSource:0}: Error finding container 5b3944d8dfcfb09835b32a8477f0611e04ecaadc469144798da19f6d7643b7f3: Status 404 returned error can't find the container with id 5b3944d8dfcfb09835b32a8477f0611e04ecaadc469144798da19f6d7643b7f3 Apr 17 21:00:03.871219 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:03.871180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-vjkk6" event={"ID":"cbb471c8-b495-4f9d-bb0c-73b1cedda753","Type":"ContainerStarted","Data":"5b3944d8dfcfb09835b32a8477f0611e04ecaadc469144798da19f6d7643b7f3"} Apr 17 21:00:05.434371 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.434331 2576 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489"] Apr 17 21:00:05.437496 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.437473 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489" Apr 17 21:00:05.441288 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.441268 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-bn6x2\"" Apr 17 21:00:05.441429 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.441316 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 21:00:05.454200 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.454179 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489"] Apr 17 21:00:05.612029 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.611997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w4bp\" (UniqueName: \"kubernetes.io/projected/68c11c08-7d9f-4a99-b37b-3653753db8d2-kube-api-access-8w4bp\") pod \"dns-operator-controller-manager-648d5c98bc-tf489\" (UID: \"68c11c08-7d9f-4a99-b37b-3653753db8d2\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489" Apr 17 21:00:05.713449 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.713417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w4bp\" (UniqueName: \"kubernetes.io/projected/68c11c08-7d9f-4a99-b37b-3653753db8d2-kube-api-access-8w4bp\") pod \"dns-operator-controller-manager-648d5c98bc-tf489\" (UID: \"68c11c08-7d9f-4a99-b37b-3653753db8d2\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489" Apr 17 21:00:05.733427 ip-10-0-130-66 kubenswrapper[2576]: 
I0417 21:00:05.733393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w4bp\" (UniqueName: \"kubernetes.io/projected/68c11c08-7d9f-4a99-b37b-3653753db8d2-kube-api-access-8w4bp\") pod \"dns-operator-controller-manager-648d5c98bc-tf489\" (UID: \"68c11c08-7d9f-4a99-b37b-3653753db8d2\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489" Apr 17 21:00:05.747226 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.747199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489" Apr 17 21:00:05.880427 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.880386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-vjkk6" event={"ID":"cbb471c8-b495-4f9d-bb0c-73b1cedda753","Type":"ContainerStarted","Data":"aea151d26d6ef9a1d9bccd6d585e02c1c0facc41f8ca6a4aa546f67e16a0ce1d"} Apr 17 21:00:05.880608 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.880461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-vjkk6" Apr 17 21:00:05.885192 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:05.885164 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489"] Apr 17 21:00:05.888012 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:00:05.887986 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c11c08_7d9f_4a99_b37b_3653753db8d2.slice/crio-b6b6b48005ce4dc5cd4e9ae7355c9c323768b05e479e35e790360c48602cce3a WatchSource:0}: Error finding container b6b6b48005ce4dc5cd4e9ae7355c9c323768b05e479e35e790360c48602cce3a: Status 404 returned error can't find the container with id b6b6b48005ce4dc5cd4e9ae7355c9c323768b05e479e35e790360c48602cce3a Apr 17 21:00:05.900253 ip-10-0-130-66 
kubenswrapper[2576]: I0417 21:00:05.900209 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-vjkk6" podStartSLOduration=1.621270427 podStartE2EDuration="3.90019604s" podCreationTimestamp="2026-04-17 21:00:02 +0000 UTC" firstStartedPulling="2026-04-17 21:00:03.085245207 +0000 UTC m=+587.319711593" lastFinishedPulling="2026-04-17 21:00:05.364170821 +0000 UTC m=+589.598637206" observedRunningTime="2026-04-17 21:00:05.899674336 +0000 UTC m=+590.134140742" watchObservedRunningTime="2026-04-17 21:00:05.90019604 +0000 UTC m=+590.134662460" Apr 17 21:00:06.887925 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:06.887893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489" event={"ID":"68c11c08-7d9f-4a99-b37b-3653753db8d2","Type":"ContainerStarted","Data":"b6b6b48005ce4dc5cd4e9ae7355c9c323768b05e479e35e790360c48602cce3a"} Apr 17 21:00:07.743926 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:07.743888 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms"] Apr 17 21:00:07.747545 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:07.747521 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" Apr 17 21:00:07.749908 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:07.749886 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-8k2hr\"" Apr 17 21:00:07.756336 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:07.756165 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms"] Apr 17 21:00:07.830172 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:07.830134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwc45\" (UniqueName: \"kubernetes.io/projected/896017fb-3791-4ae5-b73c-75ac72e132cc-kube-api-access-qwc45\") pod \"limitador-operator-controller-manager-85c4996f8c-rzrms\" (UID: \"896017fb-3791-4ae5-b73c-75ac72e132cc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" Apr 17 21:00:07.930764 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:07.930711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwc45\" (UniqueName: \"kubernetes.io/projected/896017fb-3791-4ae5-b73c-75ac72e132cc-kube-api-access-qwc45\") pod \"limitador-operator-controller-manager-85c4996f8c-rzrms\" (UID: \"896017fb-3791-4ae5-b73c-75ac72e132cc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" Apr 17 21:00:07.943749 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:07.943724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwc45\" (UniqueName: \"kubernetes.io/projected/896017fb-3791-4ae5-b73c-75ac72e132cc-kube-api-access-qwc45\") pod \"limitador-operator-controller-manager-85c4996f8c-rzrms\" (UID: \"896017fb-3791-4ae5-b73c-75ac72e132cc\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" Apr 17 21:00:08.059559 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:08.059469 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" Apr 17 21:00:08.208820 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:08.196196 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms"] Apr 17 21:00:08.897905 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:08.897868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" event={"ID":"896017fb-3791-4ae5-b73c-75ac72e132cc","Type":"ContainerStarted","Data":"90b707ae6e2a94e0d2a180ec4b93d2d3aaa7599f4ec965331d5a9e65a695a6c1"} Apr 17 21:00:08.899964 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:08.899934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489" event={"ID":"68c11c08-7d9f-4a99-b37b-3653753db8d2","Type":"ContainerStarted","Data":"f44a6b979380f22519ea695807c4025510a240656f61c2c2e31834d99ccba9eb"} Apr 17 21:00:08.900142 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:08.900127 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489" Apr 17 21:00:08.921260 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:08.921200 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489" podStartSLOduration=1.694772908 podStartE2EDuration="3.921180287s" podCreationTimestamp="2026-04-17 21:00:05 +0000 UTC" firstStartedPulling="2026-04-17 21:00:05.889803174 +0000 UTC m=+590.124269558" lastFinishedPulling="2026-04-17 21:00:08.116210537 +0000 UTC m=+592.350676937" 
observedRunningTime="2026-04-17 21:00:08.915332977 +0000 UTC m=+593.149799374" watchObservedRunningTime="2026-04-17 21:00:08.921180287 +0000 UTC m=+593.155646693" Apr 17 21:00:09.905041 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:09.904951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" event={"ID":"896017fb-3791-4ae5-b73c-75ac72e132cc","Type":"ContainerStarted","Data":"56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b"} Apr 17 21:00:09.905041 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:09.905015 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" Apr 17 21:00:09.919978 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:09.919896 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" podStartSLOduration=1.615123392 podStartE2EDuration="2.919878386s" podCreationTimestamp="2026-04-17 21:00:07 +0000 UTC" firstStartedPulling="2026-04-17 21:00:08.221080126 +0000 UTC m=+592.455546511" lastFinishedPulling="2026-04-17 21:00:09.525835107 +0000 UTC m=+593.760301505" observedRunningTime="2026-04-17 21:00:09.918704131 +0000 UTC m=+594.153170516" watchObservedRunningTime="2026-04-17 21:00:09.919878386 +0000 UTC m=+594.154344793" Apr 17 21:00:16.197890 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:16.197859 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log" Apr 17 21:00:16.198309 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:16.198290 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log" Apr 17 21:00:16.890182 ip-10-0-130-66 kubenswrapper[2576]: I0417 
21:00:16.890153 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-vjkk6" Apr 17 21:00:18.326087 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.326053 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms"] Apr 17 21:00:18.326542 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.326371 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" containerName="manager" containerID="cri-o://56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b" gracePeriod=2 Apr 17 21:00:18.328267 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.328240 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" Apr 17 21:00:18.339032 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.339002 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms"] Apr 17 21:00:18.348644 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.348613 2576 status_manager.go:895] "Failed to get status for pod" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" err="pods \"limitador-operator-controller-manager-85c4996f8c-rzrms\" is forbidden: User \"system:node:ip-10-0-130-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-66.ec2.internal' and this object" Apr 17 21:00:18.361179 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.361153 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd"] Apr 17 
21:00:18.361520 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.361503 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" containerName="manager" Apr 17 21:00:18.361566 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.361522 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" containerName="manager" Apr 17 21:00:18.361603 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.361571 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" containerName="manager" Apr 17 21:00:18.364252 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.364230 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd" Apr 17 21:00:18.366328 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.366304 2576 status_manager.go:895] "Failed to get status for pod" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" err="pods \"limitador-operator-controller-manager-85c4996f8c-rzrms\" is forbidden: User \"system:node:ip-10-0-130-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-66.ec2.internal' and this object" Apr 17 21:00:18.378973 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.378947 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd"] Apr 17 21:00:18.414992 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.414964 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpn2b\" (UniqueName: \"kubernetes.io/projected/7a2d4136-2b7f-4bc2-b986-b2baf1642332-kube-api-access-bpn2b\") pod 
\"limitador-operator-controller-manager-85c4996f8c-hntcd\" (UID: \"7a2d4136-2b7f-4bc2-b986-b2baf1642332\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd" Apr 17 21:00:18.515999 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.515967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpn2b\" (UniqueName: \"kubernetes.io/projected/7a2d4136-2b7f-4bc2-b986-b2baf1642332-kube-api-access-bpn2b\") pod \"limitador-operator-controller-manager-85c4996f8c-hntcd\" (UID: \"7a2d4136-2b7f-4bc2-b986-b2baf1642332\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd" Apr 17 21:00:18.523783 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.523752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpn2b\" (UniqueName: \"kubernetes.io/projected/7a2d4136-2b7f-4bc2-b986-b2baf1642332-kube-api-access-bpn2b\") pod \"limitador-operator-controller-manager-85c4996f8c-hntcd\" (UID: \"7a2d4136-2b7f-4bc2-b986-b2baf1642332\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd" Apr 17 21:00:18.551050 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.551028 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" Apr 17 21:00:18.552933 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.552909 2576 status_manager.go:895] "Failed to get status for pod" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" err="pods \"limitador-operator-controller-manager-85c4996f8c-rzrms\" is forbidden: User \"system:node:ip-10-0-130-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-66.ec2.internal' and this object" Apr 17 21:00:18.617296 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.617226 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwc45\" (UniqueName: \"kubernetes.io/projected/896017fb-3791-4ae5-b73c-75ac72e132cc-kube-api-access-qwc45\") pod \"896017fb-3791-4ae5-b73c-75ac72e132cc\" (UID: \"896017fb-3791-4ae5-b73c-75ac72e132cc\") " Apr 17 21:00:18.619454 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.619419 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/896017fb-3791-4ae5-b73c-75ac72e132cc-kube-api-access-qwc45" (OuterVolumeSpecName: "kube-api-access-qwc45") pod "896017fb-3791-4ae5-b73c-75ac72e132cc" (UID: "896017fb-3791-4ae5-b73c-75ac72e132cc"). InnerVolumeSpecName "kube-api-access-qwc45". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:00:18.708597 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.708550 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd" Apr 17 21:00:18.718531 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.718506 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwc45\" (UniqueName: \"kubernetes.io/projected/896017fb-3791-4ae5-b73c-75ac72e132cc-kube-api-access-qwc45\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 21:00:18.847450 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.847426 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd"] Apr 17 21:00:18.851258 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:00:18.851227 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a2d4136_2b7f_4bc2_b986_b2baf1642332.slice/crio-761be48548b9a508ee84c01102b67d8ed56d7615b494f0028f8fe94984948e8a WatchSource:0}: Error finding container 761be48548b9a508ee84c01102b67d8ed56d7615b494f0028f8fe94984948e8a: Status 404 returned error can't find the container with id 761be48548b9a508ee84c01102b67d8ed56d7615b494f0028f8fe94984948e8a Apr 17 21:00:18.935941 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.935909 2576 generic.go:358] "Generic (PLEG): container finished" podID="896017fb-3791-4ae5-b73c-75ac72e132cc" containerID="56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b" exitCode=0 Apr 17 21:00:18.936068 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.935975 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" Apr 17 21:00:18.936068 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.936000 2576 scope.go:117] "RemoveContainer" containerID="56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b" Apr 17 21:00:18.937725 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.937698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd" event={"ID":"7a2d4136-2b7f-4bc2-b986-b2baf1642332","Type":"ContainerStarted","Data":"700384dde0b69e99b3c129f2ec1734e745252012d33ef97e3d7a181a40d5c4e1"} Apr 17 21:00:18.937830 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.937736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd" event={"ID":"7a2d4136-2b7f-4bc2-b986-b2baf1642332","Type":"ContainerStarted","Data":"761be48548b9a508ee84c01102b67d8ed56d7615b494f0028f8fe94984948e8a"} Apr 17 21:00:18.937830 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.937782 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd" Apr 17 21:00:18.938186 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.938159 2576 status_manager.go:895] "Failed to get status for pod" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" err="pods \"limitador-operator-controller-manager-85c4996f8c-rzrms\" is forbidden: User \"system:node:ip-10-0-130-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-66.ec2.internal' and this object" Apr 17 21:00:18.939991 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.939963 2576 status_manager.go:895] "Failed to get status for pod" 
podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" err="pods \"limitador-operator-controller-manager-85c4996f8c-rzrms\" is forbidden: User \"system:node:ip-10-0-130-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-66.ec2.internal' and this object" Apr 17 21:00:18.944287 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.944266 2576 scope.go:117] "RemoveContainer" containerID="56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b" Apr 17 21:00:18.944563 ip-10-0-130-66 kubenswrapper[2576]: E0417 21:00:18.944546 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b\": container with ID starting with 56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b not found: ID does not exist" containerID="56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b" Apr 17 21:00:18.944629 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.944571 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b"} err="failed to get container status \"56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b\": rpc error: code = NotFound desc = could not find container \"56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b\": container with ID starting with 56a14bc8e5d2cfa7562345c6007292ca66516fb7f8adbebb2267f84dbd00da1b not found: ID does not exist" Apr 17 21:00:18.955831 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.955790 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd" podStartSLOduration=0.955779704 
podStartE2EDuration="955.779704ms" podCreationTimestamp="2026-04-17 21:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:00:18.954514045 +0000 UTC m=+603.188980467" watchObservedRunningTime="2026-04-17 21:00:18.955779704 +0000 UTC m=+603.190246111" Apr 17 21:00:18.956310 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:18.956292 2576 status_manager.go:895] "Failed to get status for pod" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" err="pods \"limitador-operator-controller-manager-85c4996f8c-rzrms\" is forbidden: User \"system:node:ip-10-0-130-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-66.ec2.internal' and this object" Apr 17 21:00:19.907380 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:19.907328 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tf489" Apr 17 21:00:19.923639 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:19.923600 2576 status_manager.go:895] "Failed to get status for pod" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rzrms" err="pods \"limitador-operator-controller-manager-85c4996f8c-rzrms\" is forbidden: User \"system:node:ip-10-0-130-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-66.ec2.internal' and this object" Apr 17 21:00:20.239724 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:20.239687 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="896017fb-3791-4ae5-b73c-75ac72e132cc" path="/var/lib/kubelet/pods/896017fb-3791-4ae5-b73c-75ac72e132cc/volumes" Apr 17 21:00:29.943747 
ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:29.943643 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hntcd" Apr 17 21:00:51.960417 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:51.960383 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wfkss"] Apr 17 21:00:51.965418 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:51.965389 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wfkss" Apr 17 21:00:51.967726 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:51.967699 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-nx2nz\"" Apr 17 21:00:51.970317 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:51.970294 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wfkss"] Apr 17 21:00:52.085785 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:52.085737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsllc\" (UniqueName: \"kubernetes.io/projected/bf94fe60-b362-4f0f-8eb6-d9a51d218549-kube-api-access-bsllc\") pod \"authorino-f99f4b5cd-wfkss\" (UID: \"bf94fe60-b362-4f0f-8eb6-d9a51d218549\") " pod="kuadrant-system/authorino-f99f4b5cd-wfkss" Apr 17 21:00:52.186448 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:52.186409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsllc\" (UniqueName: \"kubernetes.io/projected/bf94fe60-b362-4f0f-8eb6-d9a51d218549-kube-api-access-bsllc\") pod \"authorino-f99f4b5cd-wfkss\" (UID: \"bf94fe60-b362-4f0f-8eb6-d9a51d218549\") " pod="kuadrant-system/authorino-f99f4b5cd-wfkss" Apr 17 21:00:52.194242 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:52.194208 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bsllc\" (UniqueName: \"kubernetes.io/projected/bf94fe60-b362-4f0f-8eb6-d9a51d218549-kube-api-access-bsllc\") pod \"authorino-f99f4b5cd-wfkss\" (UID: \"bf94fe60-b362-4f0f-8eb6-d9a51d218549\") " pod="kuadrant-system/authorino-f99f4b5cd-wfkss" Apr 17 21:00:52.276994 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:52.276909 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wfkss" Apr 17 21:00:52.394413 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:52.394346 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wfkss"] Apr 17 21:00:52.397851 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:00:52.397818 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf94fe60_b362_4f0f_8eb6_d9a51d218549.slice/crio-b548c597f2206e9c23bb24dc1e8563cefdea77845564b6b453eafd627a6469b1 WatchSource:0}: Error finding container b548c597f2206e9c23bb24dc1e8563cefdea77845564b6b453eafd627a6469b1: Status 404 returned error can't find the container with id b548c597f2206e9c23bb24dc1e8563cefdea77845564b6b453eafd627a6469b1 Apr 17 21:00:53.046058 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:53.046005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wfkss" event={"ID":"bf94fe60-b362-4f0f-8eb6-d9a51d218549","Type":"ContainerStarted","Data":"b548c597f2206e9c23bb24dc1e8563cefdea77845564b6b453eafd627a6469b1"} Apr 17 21:00:55.055517 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:55.055472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wfkss" event={"ID":"bf94fe60-b362-4f0f-8eb6-d9a51d218549","Type":"ContainerStarted","Data":"15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f"} Apr 17 21:00:55.069400 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:55.069330 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-wfkss" podStartSLOduration=2.048222316 podStartE2EDuration="4.069316399s" podCreationTimestamp="2026-04-17 21:00:51 +0000 UTC" firstStartedPulling="2026-04-17 21:00:52.399134876 +0000 UTC m=+636.633601275" lastFinishedPulling="2026-04-17 21:00:54.420228965 +0000 UTC m=+638.654695358" observedRunningTime="2026-04-17 21:00:55.067884612 +0000 UTC m=+639.302351031" watchObservedRunningTime="2026-04-17 21:00:55.069316399 +0000 UTC m=+639.303782823" Apr 17 21:00:57.047779 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:57.047744 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wfkss"] Apr 17 21:00:57.062677 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:57.062640 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-wfkss" podUID="bf94fe60-b362-4f0f-8eb6-d9a51d218549" containerName="authorino" containerID="cri-o://15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f" gracePeriod=30 Apr 17 21:00:57.303940 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:57.303876 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wfkss" Apr 17 21:00:57.429541 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:57.429509 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsllc\" (UniqueName: \"kubernetes.io/projected/bf94fe60-b362-4f0f-8eb6-d9a51d218549-kube-api-access-bsllc\") pod \"bf94fe60-b362-4f0f-8eb6-d9a51d218549\" (UID: \"bf94fe60-b362-4f0f-8eb6-d9a51d218549\") " Apr 17 21:00:57.431773 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:57.431744 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf94fe60-b362-4f0f-8eb6-d9a51d218549-kube-api-access-bsllc" (OuterVolumeSpecName: "kube-api-access-bsllc") pod "bf94fe60-b362-4f0f-8eb6-d9a51d218549" (UID: "bf94fe60-b362-4f0f-8eb6-d9a51d218549"). InnerVolumeSpecName "kube-api-access-bsllc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:00:57.530614 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:57.530583 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bsllc\" (UniqueName: \"kubernetes.io/projected/bf94fe60-b362-4f0f-8eb6-d9a51d218549-kube-api-access-bsllc\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 21:00:58.067028 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:58.066995 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf94fe60-b362-4f0f-8eb6-d9a51d218549" containerID="15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f" exitCode=0 Apr 17 21:00:58.067492 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:58.067048 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wfkss" Apr 17 21:00:58.067492 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:58.067059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wfkss" event={"ID":"bf94fe60-b362-4f0f-8eb6-d9a51d218549","Type":"ContainerDied","Data":"15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f"} Apr 17 21:00:58.067492 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:58.067086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wfkss" event={"ID":"bf94fe60-b362-4f0f-8eb6-d9a51d218549","Type":"ContainerDied","Data":"b548c597f2206e9c23bb24dc1e8563cefdea77845564b6b453eafd627a6469b1"} Apr 17 21:00:58.067492 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:58.067101 2576 scope.go:117] "RemoveContainer" containerID="15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f" Apr 17 21:00:58.075567 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:58.075541 2576 scope.go:117] "RemoveContainer" containerID="15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f" Apr 17 21:00:58.075810 ip-10-0-130-66 kubenswrapper[2576]: E0417 21:00:58.075790 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f\": container with ID starting with 15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f not found: ID does not exist" containerID="15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f" Apr 17 21:00:58.075880 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:58.075823 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f"} err="failed to get container status \"15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f\": rpc error: code = NotFound 
desc = could not find container \"15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f\": container with ID starting with 15ed9b7da023cb93ebd7e4e69611ed03d5afbc1fa2c6cf619fa59db2840d710f not found: ID does not exist" Apr 17 21:00:58.086531 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:58.086503 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wfkss"] Apr 17 21:00:58.092108 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:58.092090 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wfkss"] Apr 17 21:00:58.240477 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:00:58.240429 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf94fe60-b362-4f0f-8eb6-d9a51d218549" path="/var/lib/kubelet/pods/bf94fe60-b362-4f0f-8eb6-d9a51d218549/volumes" Apr 17 21:01:26.637781 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.637744 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-fzdgd"] Apr 17 21:01:26.638222 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.638036 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf94fe60-b362-4f0f-8eb6-d9a51d218549" containerName="authorino" Apr 17 21:01:26.638222 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.638047 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf94fe60-b362-4f0f-8eb6-d9a51d218549" containerName="authorino" Apr 17 21:01:26.638222 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.638094 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf94fe60-b362-4f0f-8eb6-d9a51d218549" containerName="authorino" Apr 17 21:01:26.640298 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.640280 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-fzdgd" Apr 17 21:01:26.642393 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.642348 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-nx2nz\"" Apr 17 21:01:26.646400 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.646374 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-fzdgd"] Apr 17 21:01:26.652366 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.652332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-789h8\" (UniqueName: \"kubernetes.io/projected/c5f06e93-b1c4-4043-83b2-d571a03e0f61-kube-api-access-789h8\") pod \"authorino-8b475cf9f-fzdgd\" (UID: \"c5f06e93-b1c4-4043-83b2-d571a03e0f61\") " pod="kuadrant-system/authorino-8b475cf9f-fzdgd" Apr 17 21:01:26.753131 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.753099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-789h8\" (UniqueName: \"kubernetes.io/projected/c5f06e93-b1c4-4043-83b2-d571a03e0f61-kube-api-access-789h8\") pod \"authorino-8b475cf9f-fzdgd\" (UID: \"c5f06e93-b1c4-4043-83b2-d571a03e0f61\") " pod="kuadrant-system/authorino-8b475cf9f-fzdgd" Apr 17 21:01:26.761045 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.761014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-789h8\" (UniqueName: \"kubernetes.io/projected/c5f06e93-b1c4-4043-83b2-d571a03e0f61-kube-api-access-789h8\") pod \"authorino-8b475cf9f-fzdgd\" (UID: \"c5f06e93-b1c4-4043-83b2-d571a03e0f61\") " pod="kuadrant-system/authorino-8b475cf9f-fzdgd" Apr 17 21:01:26.860988 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.860950 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-fzdgd"] Apr 17 21:01:26.861164 ip-10-0-130-66 kubenswrapper[2576]: 
I0417 21:01:26.861152 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-fzdgd" Apr 17 21:01:26.886140 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.886108 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-77686d5cfb-7dwf6"] Apr 17 21:01:26.888624 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.888546 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-77686d5cfb-7dwf6" Apr 17 21:01:26.901349 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.897403 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-77686d5cfb-7dwf6"] Apr 17 21:01:26.954175 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.954135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzx7\" (UniqueName: \"kubernetes.io/projected/8c213aba-8a1f-4ab9-8c67-d493bc1e552b-kube-api-access-wvzx7\") pod \"authorino-77686d5cfb-7dwf6\" (UID: \"8c213aba-8a1f-4ab9-8c67-d493bc1e552b\") " pod="kuadrant-system/authorino-77686d5cfb-7dwf6" Apr 17 21:01:26.984620 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:26.984585 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-fzdgd"] Apr 17 21:01:26.987675 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:01:26.987648 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5f06e93_b1c4_4043_83b2_d571a03e0f61.slice/crio-2e497f7c0a6171ee8ec637566c7a1bb7b6ded0ea82d603ee7cf2ef6bfddf98b1 WatchSource:0}: Error finding container 2e497f7c0a6171ee8ec637566c7a1bb7b6ded0ea82d603ee7cf2ef6bfddf98b1: Status 404 returned error can't find the container with id 2e497f7c0a6171ee8ec637566c7a1bb7b6ded0ea82d603ee7cf2ef6bfddf98b1 Apr 17 21:01:27.008022 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:27.007996 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-77686d5cfb-7dwf6"] Apr 17 21:01:27.008223 ip-10-0-130-66 kubenswrapper[2576]: E0417 21:01:27.008203 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-wvzx7], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-77686d5cfb-7dwf6" podUID="8c213aba-8a1f-4ab9-8c67-d493bc1e552b" Apr 17 21:01:27.054625 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:27.054594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzx7\" (UniqueName: \"kubernetes.io/projected/8c213aba-8a1f-4ab9-8c67-d493bc1e552b-kube-api-access-wvzx7\") pod \"authorino-77686d5cfb-7dwf6\" (UID: \"8c213aba-8a1f-4ab9-8c67-d493bc1e552b\") " pod="kuadrant-system/authorino-77686d5cfb-7dwf6" Apr 17 21:01:27.062275 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:27.062240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzx7\" (UniqueName: \"kubernetes.io/projected/8c213aba-8a1f-4ab9-8c67-d493bc1e552b-kube-api-access-wvzx7\") pod \"authorino-77686d5cfb-7dwf6\" (UID: \"8c213aba-8a1f-4ab9-8c67-d493bc1e552b\") " pod="kuadrant-system/authorino-77686d5cfb-7dwf6" Apr 17 21:01:27.159103 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:27.159028 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-77686d5cfb-7dwf6" Apr 17 21:01:27.159250 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:27.159026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-fzdgd" event={"ID":"c5f06e93-b1c4-4043-83b2-d571a03e0f61","Type":"ContainerStarted","Data":"2e497f7c0a6171ee8ec637566c7a1bb7b6ded0ea82d603ee7cf2ef6bfddf98b1"} Apr 17 21:01:27.163723 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:27.163701 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-77686d5cfb-7dwf6" Apr 17 21:01:27.256411 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:27.256379 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvzx7\" (UniqueName: \"kubernetes.io/projected/8c213aba-8a1f-4ab9-8c67-d493bc1e552b-kube-api-access-wvzx7\") pod \"8c213aba-8a1f-4ab9-8c67-d493bc1e552b\" (UID: \"8c213aba-8a1f-4ab9-8c67-d493bc1e552b\") " Apr 17 21:01:27.258573 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:27.258544 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c213aba-8a1f-4ab9-8c67-d493bc1e552b-kube-api-access-wvzx7" (OuterVolumeSpecName: "kube-api-access-wvzx7") pod "8c213aba-8a1f-4ab9-8c67-d493bc1e552b" (UID: "8c213aba-8a1f-4ab9-8c67-d493bc1e552b"). InnerVolumeSpecName "kube-api-access-wvzx7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:01:27.356885 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:27.356858 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvzx7\" (UniqueName: \"kubernetes.io/projected/8c213aba-8a1f-4ab9-8c67-d493bc1e552b-kube-api-access-wvzx7\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 21:01:28.163819 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.163718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-fzdgd" event={"ID":"c5f06e93-b1c4-4043-83b2-d571a03e0f61","Type":"ContainerStarted","Data":"ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708"} Apr 17 21:01:28.163819 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.163749 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-77686d5cfb-7dwf6" Apr 17 21:01:28.164519 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.163826 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-fzdgd" podUID="c5f06e93-b1c4-4043-83b2-d571a03e0f61" containerName="authorino" containerID="cri-o://ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708" gracePeriod=30 Apr 17 21:01:28.185452 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.185398 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-fzdgd" podStartSLOduration=1.871585491 podStartE2EDuration="2.185378932s" podCreationTimestamp="2026-04-17 21:01:26 +0000 UTC" firstStartedPulling="2026-04-17 21:01:26.989590643 +0000 UTC m=+671.224057031" lastFinishedPulling="2026-04-17 21:01:27.303384086 +0000 UTC m=+671.537850472" observedRunningTime="2026-04-17 21:01:28.184943427 +0000 UTC m=+672.419409835" watchObservedRunningTime="2026-04-17 21:01:28.185378932 +0000 UTC m=+672.419845340" Apr 17 21:01:28.240632 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.240604 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-77686d5cfb-7dwf6"] Apr 17 21:01:28.243250 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.243227 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-77686d5cfb-7dwf6"] Apr 17 21:01:28.406716 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.406678 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-fzdgd" Apr 17 21:01:28.467693 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.467665 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-789h8\" (UniqueName: \"kubernetes.io/projected/c5f06e93-b1c4-4043-83b2-d571a03e0f61-kube-api-access-789h8\") pod \"c5f06e93-b1c4-4043-83b2-d571a03e0f61\" (UID: \"c5f06e93-b1c4-4043-83b2-d571a03e0f61\") " Apr 17 21:01:28.469925 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.469896 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f06e93-b1c4-4043-83b2-d571a03e0f61-kube-api-access-789h8" (OuterVolumeSpecName: "kube-api-access-789h8") pod "c5f06e93-b1c4-4043-83b2-d571a03e0f61" (UID: "c5f06e93-b1c4-4043-83b2-d571a03e0f61"). InnerVolumeSpecName "kube-api-access-789h8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:01:28.568502 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.568467 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-789h8\" (UniqueName: \"kubernetes.io/projected/c5f06e93-b1c4-4043-83b2-d571a03e0f61-kube-api-access-789h8\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 21:01:28.887086 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.887013 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g6lcl"] Apr 17 21:01:28.887335 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.887322 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5f06e93-b1c4-4043-83b2-d571a03e0f61" containerName="authorino" Apr 17 21:01:28.887403 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.887336 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f06e93-b1c4-4043-83b2-d571a03e0f61" containerName="authorino" Apr 17 21:01:28.887451 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.887428 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="c5f06e93-b1c4-4043-83b2-d571a03e0f61" containerName="authorino" Apr 17 21:01:28.889410 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.889386 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" Apr 17 21:01:28.891817 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.891799 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-dgjhj\"" Apr 17 21:01:28.898163 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.897816 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g6lcl"] Apr 17 21:01:28.973010 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:28.972970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgtmx\" (UniqueName: \"kubernetes.io/projected/67db86f4-5a55-48c9-b549-28a323c9f9f1-kube-api-access-zgtmx\") pod \"maas-controller-6d4c8f55f9-g6lcl\" (UID: \"67db86f4-5a55-48c9-b549-28a323c9f9f1\") " pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" Apr 17 21:01:29.048799 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.048764 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-c44cf996f-869tp"] Apr 17 21:01:29.050704 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.050675 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-c44cf996f-869tp" Apr 17 21:01:29.060436 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.060415 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-c44cf996f-869tp"] Apr 17 21:01:29.073765 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.073736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgtmx\" (UniqueName: \"kubernetes.io/projected/67db86f4-5a55-48c9-b549-28a323c9f9f1-kube-api-access-zgtmx\") pod \"maas-controller-6d4c8f55f9-g6lcl\" (UID: \"67db86f4-5a55-48c9-b549-28a323c9f9f1\") " pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" Apr 17 21:01:29.073910 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.073774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjb8n\" (UniqueName: \"kubernetes.io/projected/4052353b-43c9-4c5c-9ebc-2919b527041a-kube-api-access-pjb8n\") pod \"maas-controller-c44cf996f-869tp\" (UID: \"4052353b-43c9-4c5c-9ebc-2919b527041a\") " pod="opendatahub/maas-controller-c44cf996f-869tp" Apr 17 21:01:29.081482 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.081462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgtmx\" (UniqueName: \"kubernetes.io/projected/67db86f4-5a55-48c9-b549-28a323c9f9f1-kube-api-access-zgtmx\") pod \"maas-controller-6d4c8f55f9-g6lcl\" (UID: \"67db86f4-5a55-48c9-b549-28a323c9f9f1\") " pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" Apr 17 21:01:29.166400 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.166301 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g6lcl"] Apr 17 21:01:29.166765 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.166639 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" Apr 17 21:01:29.168594 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.168568 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5f06e93-b1c4-4043-83b2-d571a03e0f61" containerID="ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708" exitCode=0 Apr 17 21:01:29.168688 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.168614 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-fzdgd" Apr 17 21:01:29.168688 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.168629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-fzdgd" event={"ID":"c5f06e93-b1c4-4043-83b2-d571a03e0f61","Type":"ContainerDied","Data":"ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708"} Apr 17 21:01:29.168688 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.168662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-fzdgd" event={"ID":"c5f06e93-b1c4-4043-83b2-d571a03e0f61","Type":"ContainerDied","Data":"2e497f7c0a6171ee8ec637566c7a1bb7b6ded0ea82d603ee7cf2ef6bfddf98b1"} Apr 17 21:01:29.168688 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.168677 2576 scope.go:117] "RemoveContainer" containerID="ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708" Apr 17 21:01:29.174180 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.174155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjb8n\" (UniqueName: \"kubernetes.io/projected/4052353b-43c9-4c5c-9ebc-2919b527041a-kube-api-access-pjb8n\") pod \"maas-controller-c44cf996f-869tp\" (UID: \"4052353b-43c9-4c5c-9ebc-2919b527041a\") " pod="opendatahub/maas-controller-c44cf996f-869tp" Apr 17 21:01:29.184852 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.184814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-pjb8n\" (UniqueName: \"kubernetes.io/projected/4052353b-43c9-4c5c-9ebc-2919b527041a-kube-api-access-pjb8n\") pod \"maas-controller-c44cf996f-869tp\" (UID: \"4052353b-43c9-4c5c-9ebc-2919b527041a\") " pod="opendatahub/maas-controller-c44cf996f-869tp" Apr 17 21:01:29.192392 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.192322 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-545987446b-kbz79"] Apr 17 21:01:29.195282 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.195265 2576 scope.go:117] "RemoveContainer" containerID="ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708" Apr 17 21:01:29.195632 ip-10-0-130-66 kubenswrapper[2576]: E0417 21:01:29.195609 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708\": container with ID starting with ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708 not found: ID does not exist" containerID="ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708" Apr 17 21:01:29.195632 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.195622 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-545987446b-kbz79" Apr 17 21:01:29.195754 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.195639 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708"} err="failed to get container status \"ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708\": rpc error: code = NotFound desc = could not find container \"ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708\": container with ID starting with ceeae21d3a7f03bd7ca29aae7db6f7a051036954666f4ede2001d8468f616708 not found: ID does not exist" Apr 17 21:01:29.201952 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.201463 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-545987446b-kbz79"] Apr 17 21:01:29.213017 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.210787 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-fzdgd"] Apr 17 21:01:29.213017 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.212974 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-fzdgd"] Apr 17 21:01:29.274882 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.274849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpkr8\" (UniqueName: \"kubernetes.io/projected/d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb-kube-api-access-xpkr8\") pod \"maas-controller-545987446b-kbz79\" (UID: \"d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb\") " pod="opendatahub/maas-controller-545987446b-kbz79" Apr 17 21:01:29.306917 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.306884 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g6lcl"] Apr 17 21:01:29.309863 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:01:29.309835 2576 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67db86f4_5a55_48c9_b549_28a323c9f9f1.slice/crio-9fe72bf3d829c4374e1da5328ee551dd16a3ac722823e0ff133ea66e271c8550 WatchSource:0}: Error finding container 9fe72bf3d829c4374e1da5328ee551dd16a3ac722823e0ff133ea66e271c8550: Status 404 returned error can't find the container with id 9fe72bf3d829c4374e1da5328ee551dd16a3ac722823e0ff133ea66e271c8550 Apr 17 21:01:29.360252 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.360219 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-c44cf996f-869tp" Apr 17 21:01:29.376287 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.376261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpkr8\" (UniqueName: \"kubernetes.io/projected/d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb-kube-api-access-xpkr8\") pod \"maas-controller-545987446b-kbz79\" (UID: \"d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb\") " pod="opendatahub/maas-controller-545987446b-kbz79" Apr 17 21:01:29.384089 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.384060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpkr8\" (UniqueName: \"kubernetes.io/projected/d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb-kube-api-access-xpkr8\") pod \"maas-controller-545987446b-kbz79\" (UID: \"d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb\") " pod="opendatahub/maas-controller-545987446b-kbz79" Apr 17 21:01:29.479596 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.479563 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-c44cf996f-869tp"] Apr 17 21:01:29.482663 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:01:29.482613 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4052353b_43c9_4c5c_9ebc_2919b527041a.slice/crio-655ffc38d862c5b27e33d17f06f254f2adc819beb57dcc0d1ac605efe06236fd WatchSource:0}: Error finding container 655ffc38d862c5b27e33d17f06f254f2adc819beb57dcc0d1ac605efe06236fd: Status 404 returned error can't find the container with id 655ffc38d862c5b27e33d17f06f254f2adc819beb57dcc0d1ac605efe06236fd Apr 17 21:01:29.527628 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.527602 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-545987446b-kbz79" Apr 17 21:01:29.645498 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:29.645468 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-545987446b-kbz79"] Apr 17 21:01:29.648436 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:01:29.648405 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a15a1a_3fa5_4ef4_83c6_c5dfe061a0cb.slice/crio-f0413aed1090ea237a27a7fee438519497053bfd3d30b1c91b7e8d8edbb52630 WatchSource:0}: Error finding container f0413aed1090ea237a27a7fee438519497053bfd3d30b1c91b7e8d8edbb52630: Status 404 returned error can't find the container with id f0413aed1090ea237a27a7fee438519497053bfd3d30b1c91b7e8d8edbb52630 Apr 17 21:01:30.174842 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:30.174802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c44cf996f-869tp" event={"ID":"4052353b-43c9-4c5c-9ebc-2919b527041a","Type":"ContainerStarted","Data":"655ffc38d862c5b27e33d17f06f254f2adc819beb57dcc0d1ac605efe06236fd"} Apr 17 21:01:30.177874 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:30.177842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-545987446b-kbz79" 
event={"ID":"d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb","Type":"ContainerStarted","Data":"f0413aed1090ea237a27a7fee438519497053bfd3d30b1c91b7e8d8edbb52630"} Apr 17 21:01:30.179142 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:30.179107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" event={"ID":"67db86f4-5a55-48c9-b549-28a323c9f9f1","Type":"ContainerStarted","Data":"9fe72bf3d829c4374e1da5328ee551dd16a3ac722823e0ff133ea66e271c8550"} Apr 17 21:01:30.244494 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:30.243187 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c213aba-8a1f-4ab9-8c67-d493bc1e552b" path="/var/lib/kubelet/pods/8c213aba-8a1f-4ab9-8c67-d493bc1e552b/volumes" Apr 17 21:01:30.244494 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:30.243520 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f06e93-b1c4-4043-83b2-d571a03e0f61" path="/var/lib/kubelet/pods/c5f06e93-b1c4-4043-83b2-d571a03e0f61/volumes" Apr 17 21:01:33.193776 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.193737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c44cf996f-869tp" event={"ID":"4052353b-43c9-4c5c-9ebc-2919b527041a","Type":"ContainerStarted","Data":"d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9"} Apr 17 21:01:33.194189 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.193868 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-c44cf996f-869tp" Apr 17 21:01:33.195086 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.195057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-545987446b-kbz79" event={"ID":"d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb","Type":"ContainerStarted","Data":"ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303"} Apr 17 21:01:33.195193 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.195174 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-545987446b-kbz79" Apr 17 21:01:33.196306 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.196284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" event={"ID":"67db86f4-5a55-48c9-b549-28a323c9f9f1","Type":"ContainerStarted","Data":"c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74"} Apr 17 21:01:33.196399 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.196333 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" Apr 17 21:01:33.196399 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.196338 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" podUID="67db86f4-5a55-48c9-b549-28a323c9f9f1" containerName="manager" containerID="cri-o://c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74" gracePeriod=10 Apr 17 21:01:33.209100 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.209064 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-c44cf996f-869tp" podStartSLOduration=1.21281878 podStartE2EDuration="4.20905271s" podCreationTimestamp="2026-04-17 21:01:29 +0000 UTC" firstStartedPulling="2026-04-17 21:01:29.483991788 +0000 UTC m=+673.718458173" lastFinishedPulling="2026-04-17 21:01:32.480225702 +0000 UTC m=+676.714692103" observedRunningTime="2026-04-17 21:01:33.207900362 +0000 UTC m=+677.442366787" watchObservedRunningTime="2026-04-17 21:01:33.20905271 +0000 UTC m=+677.443519118" Apr 17 21:01:33.220960 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.220906 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-545987446b-kbz79" podStartSLOduration=1.377704032 podStartE2EDuration="4.220891249s" podCreationTimestamp="2026-04-17 
21:01:29 +0000 UTC" firstStartedPulling="2026-04-17 21:01:29.649772273 +0000 UTC m=+673.884238658" lastFinishedPulling="2026-04-17 21:01:32.492959474 +0000 UTC m=+676.727425875" observedRunningTime="2026-04-17 21:01:33.220539722 +0000 UTC m=+677.455006131" watchObservedRunningTime="2026-04-17 21:01:33.220891249 +0000 UTC m=+677.455357656" Apr 17 21:01:33.236222 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.236182 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" podStartSLOduration=2.067068746 podStartE2EDuration="5.236168921s" podCreationTimestamp="2026-04-17 21:01:28 +0000 UTC" firstStartedPulling="2026-04-17 21:01:29.31112719 +0000 UTC m=+673.545593575" lastFinishedPulling="2026-04-17 21:01:32.480227364 +0000 UTC m=+676.714693750" observedRunningTime="2026-04-17 21:01:33.234765193 +0000 UTC m=+677.469231601" watchObservedRunningTime="2026-04-17 21:01:33.236168921 +0000 UTC m=+677.470635327" Apr 17 21:01:33.426821 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.426800 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" Apr 17 21:01:33.513736 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.513641 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgtmx\" (UniqueName: \"kubernetes.io/projected/67db86f4-5a55-48c9-b549-28a323c9f9f1-kube-api-access-zgtmx\") pod \"67db86f4-5a55-48c9-b549-28a323c9f9f1\" (UID: \"67db86f4-5a55-48c9-b549-28a323c9f9f1\") " Apr 17 21:01:33.516007 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.515977 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67db86f4-5a55-48c9-b549-28a323c9f9f1-kube-api-access-zgtmx" (OuterVolumeSpecName: "kube-api-access-zgtmx") pod "67db86f4-5a55-48c9-b549-28a323c9f9f1" (UID: "67db86f4-5a55-48c9-b549-28a323c9f9f1"). 
InnerVolumeSpecName "kube-api-access-zgtmx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:01:33.614707 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:33.614668 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zgtmx\" (UniqueName: \"kubernetes.io/projected/67db86f4-5a55-48c9-b549-28a323c9f9f1-kube-api-access-zgtmx\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\"" Apr 17 21:01:34.200618 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:34.200585 2576 generic.go:358] "Generic (PLEG): container finished" podID="67db86f4-5a55-48c9-b549-28a323c9f9f1" containerID="c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74" exitCode=0 Apr 17 21:01:34.201036 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:34.200672 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" Apr 17 21:01:34.201036 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:34.200672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" event={"ID":"67db86f4-5a55-48c9-b549-28a323c9f9f1","Type":"ContainerDied","Data":"c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74"} Apr 17 21:01:34.201036 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:34.200713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-g6lcl" event={"ID":"67db86f4-5a55-48c9-b549-28a323c9f9f1","Type":"ContainerDied","Data":"9fe72bf3d829c4374e1da5328ee551dd16a3ac722823e0ff133ea66e271c8550"} Apr 17 21:01:34.201036 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:34.200729 2576 scope.go:117] "RemoveContainer" containerID="c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74" Apr 17 21:01:34.210132 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:34.210112 2576 scope.go:117] "RemoveContainer" containerID="c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74" Apr 17 
21:01:34.210404 ip-10-0-130-66 kubenswrapper[2576]: E0417 21:01:34.210381 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74\": container with ID starting with c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74 not found: ID does not exist" containerID="c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74" Apr 17 21:01:34.210478 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:34.210418 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74"} err="failed to get container status \"c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74\": rpc error: code = NotFound desc = could not find container \"c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74\": container with ID starting with c78910f3f2c689a2b202397f680da518b10d3cdb977fafc19980df54d652da74 not found: ID does not exist" Apr 17 21:01:34.221663 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:34.221639 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g6lcl"] Apr 17 21:01:34.225228 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:34.225205 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g6lcl"] Apr 17 21:01:34.240594 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:34.240567 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67db86f4-5a55-48c9-b549-28a323c9f9f1" path="/var/lib/kubelet/pods/67db86f4-5a55-48c9-b549-28a323c9f9f1/volumes" Apr 17 21:01:44.205494 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.205456 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-545987446b-kbz79" Apr 17 21:01:44.205892 ip-10-0-130-66 kubenswrapper[2576]: 
I0417 21:01:44.205873 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-c44cf996f-869tp" Apr 17 21:01:44.256949 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.256912 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-c44cf996f-869tp"] Apr 17 21:01:44.257149 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.257125 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-c44cf996f-869tp" podUID="4052353b-43c9-4c5c-9ebc-2919b527041a" containerName="manager" containerID="cri-o://d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9" gracePeriod=10 Apr 17 21:01:44.494155 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.494131 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-c44cf996f-869tp" Apr 17 21:01:44.537543 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.537511 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d87747d66-j7w9t"] Apr 17 21:01:44.537808 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.537797 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4052353b-43c9-4c5c-9ebc-2919b527041a" containerName="manager" Apr 17 21:01:44.537846 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.537810 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4052353b-43c9-4c5c-9ebc-2919b527041a" containerName="manager" Apr 17 21:01:44.537846 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.537826 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67db86f4-5a55-48c9-b549-28a323c9f9f1" containerName="manager" Apr 17 21:01:44.537846 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.537831 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="67db86f4-5a55-48c9-b549-28a323c9f9f1" containerName="manager" Apr 17 21:01:44.537934 
ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.537875 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="67db86f4-5a55-48c9-b549-28a323c9f9f1" containerName="manager"
Apr 17 21:01:44.537934 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.537883 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4052353b-43c9-4c5c-9ebc-2919b527041a" containerName="manager"
Apr 17 21:01:44.539651 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.539629 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d87747d66-j7w9t"
Apr 17 21:01:44.547573 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.547550 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d87747d66-j7w9t"]
Apr 17 21:01:44.621997 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.621959 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjb8n\" (UniqueName: \"kubernetes.io/projected/4052353b-43c9-4c5c-9ebc-2919b527041a-kube-api-access-pjb8n\") pod \"4052353b-43c9-4c5c-9ebc-2919b527041a\" (UID: \"4052353b-43c9-4c5c-9ebc-2919b527041a\") "
Apr 17 21:01:44.622144 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.622105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6r74\" (UniqueName: \"kubernetes.io/projected/50ba2daf-7a9f-4259-8de3-72e124c3f4e1-kube-api-access-m6r74\") pod \"maas-controller-6d87747d66-j7w9t\" (UID: \"50ba2daf-7a9f-4259-8de3-72e124c3f4e1\") " pod="opendatahub/maas-controller-6d87747d66-j7w9t"
Apr 17 21:01:44.624235 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.624212 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4052353b-43c9-4c5c-9ebc-2919b527041a-kube-api-access-pjb8n" (OuterVolumeSpecName: "kube-api-access-pjb8n") pod "4052353b-43c9-4c5c-9ebc-2919b527041a" (UID: "4052353b-43c9-4c5c-9ebc-2919b527041a"). InnerVolumeSpecName "kube-api-access-pjb8n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:01:44.723492 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.723456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6r74\" (UniqueName: \"kubernetes.io/projected/50ba2daf-7a9f-4259-8de3-72e124c3f4e1-kube-api-access-m6r74\") pod \"maas-controller-6d87747d66-j7w9t\" (UID: \"50ba2daf-7a9f-4259-8de3-72e124c3f4e1\") " pod="opendatahub/maas-controller-6d87747d66-j7w9t"
Apr 17 21:01:44.723658 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.723518 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjb8n\" (UniqueName: \"kubernetes.io/projected/4052353b-43c9-4c5c-9ebc-2919b527041a-kube-api-access-pjb8n\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\""
Apr 17 21:01:44.731668 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.731615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6r74\" (UniqueName: \"kubernetes.io/projected/50ba2daf-7a9f-4259-8de3-72e124c3f4e1-kube-api-access-m6r74\") pod \"maas-controller-6d87747d66-j7w9t\" (UID: \"50ba2daf-7a9f-4259-8de3-72e124c3f4e1\") " pod="opendatahub/maas-controller-6d87747d66-j7w9t"
Apr 17 21:01:44.850962 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.850924 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d87747d66-j7w9t"
Apr 17 21:01:44.971932 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:44.971857 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d87747d66-j7w9t"]
Apr 17 21:01:44.974690 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:01:44.974655 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50ba2daf_7a9f_4259_8de3_72e124c3f4e1.slice/crio-ee5e968e4d80ebc16dea5a55a268b406620a827203f74d2da9aafb203079de50 WatchSource:0}: Error finding container ee5e968e4d80ebc16dea5a55a268b406620a827203f74d2da9aafb203079de50: Status 404 returned error can't find the container with id ee5e968e4d80ebc16dea5a55a268b406620a827203f74d2da9aafb203079de50
Apr 17 21:01:45.241549 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:45.241512 2576 generic.go:358] "Generic (PLEG): container finished" podID="4052353b-43c9-4c5c-9ebc-2919b527041a" containerID="d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9" exitCode=0
Apr 17 21:01:45.241996 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:45.241560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c44cf996f-869tp" event={"ID":"4052353b-43c9-4c5c-9ebc-2919b527041a","Type":"ContainerDied","Data":"d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9"}
Apr 17 21:01:45.241996 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:45.241590 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-c44cf996f-869tp"
Apr 17 21:01:45.241996 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:45.241608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c44cf996f-869tp" event={"ID":"4052353b-43c9-4c5c-9ebc-2919b527041a","Type":"ContainerDied","Data":"655ffc38d862c5b27e33d17f06f254f2adc819beb57dcc0d1ac605efe06236fd"}
Apr 17 21:01:45.241996 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:45.241628 2576 scope.go:117] "RemoveContainer" containerID="d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9"
Apr 17 21:01:45.243086 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:45.243056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d87747d66-j7w9t" event={"ID":"50ba2daf-7a9f-4259-8de3-72e124c3f4e1","Type":"ContainerStarted","Data":"ee5e968e4d80ebc16dea5a55a268b406620a827203f74d2da9aafb203079de50"}
Apr 17 21:01:45.251433 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:45.251414 2576 scope.go:117] "RemoveContainer" containerID="d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9"
Apr 17 21:01:45.251676 ip-10-0-130-66 kubenswrapper[2576]: E0417 21:01:45.251660 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9\": container with ID starting with d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9 not found: ID does not exist" containerID="d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9"
Apr 17 21:01:45.251724 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:45.251684 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9"} err="failed to get container status \"d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9\": rpc error: code = NotFound desc = could not find container \"d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9\": container with ID starting with d8445c0ffde7db6b196411f87f952ba7d9f2fbf1c5219275214b6dff64b40ff9 not found: ID does not exist"
Apr 17 21:01:45.262839 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:45.262815 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-c44cf996f-869tp"]
Apr 17 21:01:45.266096 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:45.266074 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-c44cf996f-869tp"]
Apr 17 21:01:46.242419 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:46.242388 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4052353b-43c9-4c5c-9ebc-2919b527041a" path="/var/lib/kubelet/pods/4052353b-43c9-4c5c-9ebc-2919b527041a/volumes"
Apr 17 21:01:46.248018 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:46.247992 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d87747d66-j7w9t" event={"ID":"50ba2daf-7a9f-4259-8de3-72e124c3f4e1","Type":"ContainerStarted","Data":"79a8161dceb77c8b2cd33ea068170c452facdfe8e2f452200f5913463a1845ee"}
Apr 17 21:01:46.248168 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:46.248156 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d87747d66-j7w9t"
Apr 17 21:01:46.266479 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:46.266430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d87747d66-j7w9t" podStartSLOduration=1.9177459529999998 podStartE2EDuration="2.266415506s" podCreationTimestamp="2026-04-17 21:01:44 +0000 UTC" firstStartedPulling="2026-04-17 21:01:44.976059785 +0000 UTC m=+689.210526173" lastFinishedPulling="2026-04-17 21:01:45.32472934 +0000 UTC m=+689.559195726" observedRunningTime="2026-04-17 21:01:46.264509253 +0000 UTC m=+690.498975667" watchObservedRunningTime="2026-04-17 21:01:46.266415506 +0000 UTC m=+690.500881913"
Apr 17 21:01:55.812215 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:55.812182 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-644f6c4f7d-9m5vf"]
Apr 17 21:01:55.814486 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:55.814468 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:01:55.816925 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:55.816906 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 17 21:01:55.817019 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:55.816930 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 17 21:01:55.822642 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:55.822615 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-644f6c4f7d-9m5vf"]
Apr 17 21:01:55.912147 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:55.912102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vsrp\" (UniqueName: \"kubernetes.io/projected/cd002444-6404-4f4e-967e-bf82417df650-kube-api-access-8vsrp\") pod \"maas-api-644f6c4f7d-9m5vf\" (UID: \"cd002444-6404-4f4e-967e-bf82417df650\") " pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:01:55.912336 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:55.912224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cd002444-6404-4f4e-967e-bf82417df650-maas-api-tls\") pod \"maas-api-644f6c4f7d-9m5vf\" (UID: \"cd002444-6404-4f4e-967e-bf82417df650\") " pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:01:56.013075 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:56.013042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vsrp\" (UniqueName: \"kubernetes.io/projected/cd002444-6404-4f4e-967e-bf82417df650-kube-api-access-8vsrp\") pod \"maas-api-644f6c4f7d-9m5vf\" (UID: \"cd002444-6404-4f4e-967e-bf82417df650\") " pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:01:56.013259 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:56.013150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cd002444-6404-4f4e-967e-bf82417df650-maas-api-tls\") pod \"maas-api-644f6c4f7d-9m5vf\" (UID: \"cd002444-6404-4f4e-967e-bf82417df650\") " pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:01:56.016282 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:56.016253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cd002444-6404-4f4e-967e-bf82417df650-maas-api-tls\") pod \"maas-api-644f6c4f7d-9m5vf\" (UID: \"cd002444-6404-4f4e-967e-bf82417df650\") " pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:01:56.022856 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:56.022829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vsrp\" (UniqueName: \"kubernetes.io/projected/cd002444-6404-4f4e-967e-bf82417df650-kube-api-access-8vsrp\") pod \"maas-api-644f6c4f7d-9m5vf\" (UID: \"cd002444-6404-4f4e-967e-bf82417df650\") " pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:01:56.125654 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:56.125564 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:01:56.287299 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:56.287265 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-644f6c4f7d-9m5vf"]
Apr 17 21:01:56.290723 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:01:56.290678 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd002444_6404_4f4e_967e_bf82417df650.slice/crio-e8e9b6a6dd4bce2f9f597e352c5f2fab6c90617a030148baf2a6131119e2e9b8 WatchSource:0}: Error finding container e8e9b6a6dd4bce2f9f597e352c5f2fab6c90617a030148baf2a6131119e2e9b8: Status 404 returned error can't find the container with id e8e9b6a6dd4bce2f9f597e352c5f2fab6c90617a030148baf2a6131119e2e9b8
Apr 17 21:01:57.256147 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:57.256112 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d87747d66-j7w9t"
Apr 17 21:01:57.284540 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:57.284487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-644f6c4f7d-9m5vf" event={"ID":"cd002444-6404-4f4e-967e-bf82417df650","Type":"ContainerStarted","Data":"e8e9b6a6dd4bce2f9f597e352c5f2fab6c90617a030148baf2a6131119e2e9b8"}
Apr 17 21:01:57.294967 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:57.294942 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-545987446b-kbz79"]
Apr 17 21:01:57.295251 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:57.295220 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-545987446b-kbz79" podUID="d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb" containerName="manager" containerID="cri-o://ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303" gracePeriod=10
Apr 17 21:01:57.783085 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:57.783052 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-545987446b-kbz79"
Apr 17 21:01:57.927070 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:57.927041 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpkr8\" (UniqueName: \"kubernetes.io/projected/d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb-kube-api-access-xpkr8\") pod \"d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb\" (UID: \"d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb\") "
Apr 17 21:01:57.929226 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:57.929190 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb-kube-api-access-xpkr8" (OuterVolumeSpecName: "kube-api-access-xpkr8") pod "d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb" (UID: "d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb"). InnerVolumeSpecName "kube-api-access-xpkr8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:01:58.028476 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.028386 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpkr8\" (UniqueName: \"kubernetes.io/projected/d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb-kube-api-access-xpkr8\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\""
Apr 17 21:01:58.289548 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.289449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-644f6c4f7d-9m5vf" event={"ID":"cd002444-6404-4f4e-967e-bf82417df650","Type":"ContainerStarted","Data":"ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558"}
Apr 17 21:01:58.289964 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.289606 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:01:58.290656 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.290635 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb" containerID="ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303" exitCode=0
Apr 17 21:01:58.290734 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.290697 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-545987446b-kbz79"
Apr 17 21:01:58.290770 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.290726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-545987446b-kbz79" event={"ID":"d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb","Type":"ContainerDied","Data":"ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303"}
Apr 17 21:01:58.290770 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.290760 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-545987446b-kbz79" event={"ID":"d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb","Type":"ContainerDied","Data":"f0413aed1090ea237a27a7fee438519497053bfd3d30b1c91b7e8d8edbb52630"}
Apr 17 21:01:58.290831 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.290777 2576 scope.go:117] "RemoveContainer" containerID="ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303"
Apr 17 21:01:58.298961 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.298785 2576 scope.go:117] "RemoveContainer" containerID="ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303"
Apr 17 21:01:58.299060 ip-10-0-130-66 kubenswrapper[2576]: E0417 21:01:58.299043 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303\": container with ID starting with ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303 not found: ID does not exist" containerID="ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303"
Apr 17 21:01:58.299107 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.299069 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303"} err="failed to get container status \"ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303\": rpc error: code = NotFound desc = could not find container \"ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303\": container with ID starting with ddd63c44ff67a4872de6a0b0e40c790ed80979961b030dcea8bb3b66d64e2303 not found: ID does not exist"
Apr 17 21:01:58.318414 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.318346 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-644f6c4f7d-9m5vf" podStartSLOduration=1.8146989850000002 podStartE2EDuration="3.318333348s" podCreationTimestamp="2026-04-17 21:01:55 +0000 UTC" firstStartedPulling="2026-04-17 21:01:56.291880013 +0000 UTC m=+700.526346398" lastFinishedPulling="2026-04-17 21:01:57.795514362 +0000 UTC m=+702.029980761" observedRunningTime="2026-04-17 21:01:58.305882042 +0000 UTC m=+702.540348449" watchObservedRunningTime="2026-04-17 21:01:58.318333348 +0000 UTC m=+702.552799758"
Apr 17 21:01:58.318864 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.318849 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-545987446b-kbz79"]
Apr 17 21:01:58.322516 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:01:58.322497 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-545987446b-kbz79"]
Apr 17 21:02:00.240231 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:00.240198 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb" path="/var/lib/kubelet/pods/d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb/volumes"
Apr 17 21:02:04.300906 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:04.300877 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:02:20.975540 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:20.975504 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"]
Apr 17 21:02:20.975929 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:20.975821 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb" containerName="manager"
Apr 17 21:02:20.975929 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:20.975832 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb" containerName="manager"
Apr 17 21:02:20.975929 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:20.975884 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7a15a1a-3fa5-4ef4-83c6-c5dfe061a0cb" containerName="manager"
Apr 17 21:02:20.980293 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:20.980271 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:20.983803 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:20.983766 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 17 21:02:20.983968 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:20.983946 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\""
Apr 17 21:02:20.984068 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:20.983771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-lj54k\""
Apr 17 21:02:20.984156 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:20.983858 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 17 21:02:20.987928 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:20.987904 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"]
Apr 17 21:02:21.000519 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.000498 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.000628 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.000532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzq8w\" (UniqueName: \"kubernetes.io/projected/0f91c53f-d373-40bc-b118-8725d5241312-kube-api-access-kzq8w\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.000628 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.000557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.000698 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.000648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0f91c53f-d373-40bc-b118-8725d5241312-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.000748 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.000734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.000783 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.000770 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.101287 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.101255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.101480 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.101298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.101480 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.101333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.101480 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.101368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzq8w\" (UniqueName: \"kubernetes.io/projected/0f91c53f-d373-40bc-b118-8725d5241312-kube-api-access-kzq8w\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.101480 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.101389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.101480 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.101409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0f91c53f-d373-40bc-b118-8725d5241312-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.101730 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.101664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.101730 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.101714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.101869 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.101852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.103729 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.103702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0f91c53f-d373-40bc-b118-8725d5241312-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.103987 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.103972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0f91c53f-d373-40bc-b118-8725d5241312-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.109383 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.109347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzq8w\" (UniqueName: \"kubernetes.io/projected/0f91c53f-d373-40bc-b118-8725d5241312-kube-api-access-kzq8w\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-68qz7\" (UID: \"0f91c53f-d373-40bc-b118-8725d5241312\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.293092 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.293005 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:21.418175 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:21.418152 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"]
Apr 17 21:02:21.420320 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:02:21.420277 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f91c53f_d373_40bc_b118_8725d5241312.slice/crio-7204c891455f6db69f733a7b2a963d9eb440c8275cadbb6985fcd9781a51a2d4 WatchSource:0}: Error finding container 7204c891455f6db69f733a7b2a963d9eb440c8275cadbb6985fcd9781a51a2d4: Status 404 returned error can't find the container with id 7204c891455f6db69f733a7b2a963d9eb440c8275cadbb6985fcd9781a51a2d4
Apr 17 21:02:22.371023 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:22.370987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7" event={"ID":"0f91c53f-d373-40bc-b118-8725d5241312","Type":"ContainerStarted","Data":"7204c891455f6db69f733a7b2a963d9eb440c8275cadbb6985fcd9781a51a2d4"}
Apr 17 21:02:24.345463 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:24.345426 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-644f6c4f7d-9m5vf"]
Apr 17 21:02:24.345912 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:24.345706 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-644f6c4f7d-9m5vf" podUID="cd002444-6404-4f4e-967e-bf82417df650" containerName="maas-api" containerID="cri-o://ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558" gracePeriod=30
Apr 17 21:02:26.988995 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:26.988972 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:02:27.061548 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.061505 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cd002444-6404-4f4e-967e-bf82417df650-maas-api-tls\") pod \"cd002444-6404-4f4e-967e-bf82417df650\" (UID: \"cd002444-6404-4f4e-967e-bf82417df650\") "
Apr 17 21:02:27.061718 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.061626 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vsrp\" (UniqueName: \"kubernetes.io/projected/cd002444-6404-4f4e-967e-bf82417df650-kube-api-access-8vsrp\") pod \"cd002444-6404-4f4e-967e-bf82417df650\" (UID: \"cd002444-6404-4f4e-967e-bf82417df650\") "
Apr 17 21:02:27.063709 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.063675 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd002444-6404-4f4e-967e-bf82417df650-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "cd002444-6404-4f4e-967e-bf82417df650" (UID: "cd002444-6404-4f4e-967e-bf82417df650"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 21:02:27.063846 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.063763 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd002444-6404-4f4e-967e-bf82417df650-kube-api-access-8vsrp" (OuterVolumeSpecName: "kube-api-access-8vsrp") pod "cd002444-6404-4f4e-967e-bf82417df650" (UID: "cd002444-6404-4f4e-967e-bf82417df650"). InnerVolumeSpecName "kube-api-access-8vsrp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:02:27.163149 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.163073 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8vsrp\" (UniqueName: \"kubernetes.io/projected/cd002444-6404-4f4e-967e-bf82417df650-kube-api-access-8vsrp\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\""
Apr 17 21:02:27.163149 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.163103 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cd002444-6404-4f4e-967e-bf82417df650-maas-api-tls\") on node \"ip-10-0-130-66.ec2.internal\" DevicePath \"\""
Apr 17 21:02:27.390837 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.390794 2576 generic.go:358] "Generic (PLEG): container finished" podID="cd002444-6404-4f4e-967e-bf82417df650" containerID="ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558" exitCode=0
Apr 17 21:02:27.391038 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.390856 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-644f6c4f7d-9m5vf"
Apr 17 21:02:27.391038 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.390881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-644f6c4f7d-9m5vf" event={"ID":"cd002444-6404-4f4e-967e-bf82417df650","Type":"ContainerDied","Data":"ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558"}
Apr 17 21:02:27.391038 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.390923 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-644f6c4f7d-9m5vf" event={"ID":"cd002444-6404-4f4e-967e-bf82417df650","Type":"ContainerDied","Data":"e8e9b6a6dd4bce2f9f597e352c5f2fab6c90617a030148baf2a6131119e2e9b8"}
Apr 17 21:02:27.391038 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.390945 2576 scope.go:117] "RemoveContainer" containerID="ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558"
Apr 17 21:02:27.392448 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.392426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7" event={"ID":"0f91c53f-d373-40bc-b118-8725d5241312","Type":"ContainerStarted","Data":"314ceb11bb93f0d8dced38d0b4a2a495f24281a91d8719dcfd3de60504ec57b2"}
Apr 17 21:02:27.400451 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.400425 2576 scope.go:117] "RemoveContainer" containerID="ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558"
Apr 17 21:02:27.400744 ip-10-0-130-66 kubenswrapper[2576]: E0417 21:02:27.400724 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558\": container with ID starting with ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558 not found: ID does not exist" containerID="ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558"
Apr 17 21:02:27.400815 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.400754 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558"} err="failed to get container status \"ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558\": rpc error: code = NotFound desc = could not find container \"ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558\": container with ID starting with ded837471706561f0b7a07b067053f3ebd79051b01480ae101c44f5449d91558 not found: ID does not exist"
Apr 17 21:02:27.423781 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.423694 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-644f6c4f7d-9m5vf"]
Apr 17 21:02:27.426516 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:27.426491 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-644f6c4f7d-9m5vf"]
Apr 17 21:02:28.239906 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:28.239871 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd002444-6404-4f4e-967e-bf82417df650" path="/var/lib/kubelet/pods/cd002444-6404-4f4e-967e-bf82417df650/volumes"
Apr 17 21:02:32.411959 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:32.411921 2576 generic.go:358] "Generic (PLEG): container finished" podID="0f91c53f-d373-40bc-b118-8725d5241312" containerID="314ceb11bb93f0d8dced38d0b4a2a495f24281a91d8719dcfd3de60504ec57b2" exitCode=0
Apr 17 21:02:32.412419 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:32.411983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7" event={"ID":"0f91c53f-d373-40bc-b118-8725d5241312","Type":"ContainerDied","Data":"314ceb11bb93f0d8dced38d0b4a2a495f24281a91d8719dcfd3de60504ec57b2"}
Apr 17 21:02:34.425213 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:34.425175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7" event={"ID":"0f91c53f-d373-40bc-b118-8725d5241312","Type":"ContainerStarted","Data":"51b4fa0ef1dc21538d1f6d1c8569d5687c6b1ee2c68689b1f63db5cf36cc31ab"}
Apr 17 21:02:34.425624 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:34.425413 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7"
Apr 17 21:02:34.443742 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:34.443692 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7" podStartSLOduration=2.4353858 podStartE2EDuration="14.443678541s" podCreationTimestamp="2026-04-17 21:02:20 +0000 UTC" firstStartedPulling="2026-04-17 21:02:21.42210426 +0000 UTC m=+725.656570648" lastFinishedPulling="2026-04-17 21:02:33.430397004 +0000 UTC m=+737.664863389" observedRunningTime="2026-04-17 21:02:34.441686193 +0000 UTC m=+738.676152599" watchObservedRunningTime="2026-04-17 21:02:34.443678541 +0000 UTC m=+738.678144948"
Apr 17 21:02:41.764983 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.764946 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95"]
Apr 17 21:02:41.765379 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.765244 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd002444-6404-4f4e-967e-bf82417df650" containerName="maas-api"
Apr 17 21:02:41.765379 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.765255 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd002444-6404-4f4e-967e-bf82417df650" containerName="maas-api"
Apr 17 21:02:41.765379 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.765314 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd002444-6404-4f4e-967e-bf82417df650" containerName="maas-api"
Apr 17 21:02:41.769071 ip-10-0-130-66 kubenswrapper[2576]: I0417
21:02:41.769049 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.771329 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.771305 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 21:02:41.778545 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.778522 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95"] Apr 17 21:02:41.890202 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.890168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9676abf9-799c-4a3b-9467-0602c29d5367-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.890384 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.890215 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.890384 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.890280 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 
17 21:02:41.890384 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.890332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.890384 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.890375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.890522 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.890399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5pn6\" (UniqueName: \"kubernetes.io/projected/9676abf9-799c-4a3b-9467-0602c29d5367-kube-api-access-f5pn6\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.991699 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.991657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9676abf9-799c-4a3b-9467-0602c29d5367-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.991943 ip-10-0-130-66 kubenswrapper[2576]: I0417 
21:02:41.991709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.991943 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.991756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.991943 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.991797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.992109 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.991946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.992109 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.991989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5pn6\" (UniqueName: 
\"kubernetes.io/projected/9676abf9-799c-4a3b-9467-0602c29d5367-kube-api-access-f5pn6\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.992264 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.992240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.992501 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.992477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.993375 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.993184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.994813 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.994792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9676abf9-799c-4a3b-9467-0602c29d5367-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") 
" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.994931 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.994885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9676abf9-799c-4a3b-9467-0602c29d5367-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:41.999706 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:41.999681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5pn6\" (UniqueName: \"kubernetes.io/projected/9676abf9-799c-4a3b-9467-0602c29d5367-kube-api-access-f5pn6\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jmr95\" (UID: \"9676abf9-799c-4a3b-9467-0602c29d5367\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:42.078656 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:42.078578 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:42.200591 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:42.200564 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95"] Apr 17 21:02:42.203124 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:02:42.203096 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9676abf9_799c_4a3b_9467_0602c29d5367.slice/crio-616028fc2c4928ea097c68a4ee59465701ee5a94877a9b2bbaf9b1371f705f53 WatchSource:0}: Error finding container 616028fc2c4928ea097c68a4ee59465701ee5a94877a9b2bbaf9b1371f705f53: Status 404 returned error can't find the container with id 616028fc2c4928ea097c68a4ee59465701ee5a94877a9b2bbaf9b1371f705f53 Apr 17 21:02:42.453177 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:42.453144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" event={"ID":"9676abf9-799c-4a3b-9467-0602c29d5367","Type":"ContainerStarted","Data":"965bc22df68e944dc0d2859c235657d6be02b379a8d8c1a096b6f8f2eaed9936"} Apr 17 21:02:42.453177 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:42.453181 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" event={"ID":"9676abf9-799c-4a3b-9467-0602c29d5367","Type":"ContainerStarted","Data":"616028fc2c4928ea097c68a4ee59465701ee5a94877a9b2bbaf9b1371f705f53"} Apr 17 21:02:45.442070 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:45.442040 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-68qz7" Apr 17 21:02:50.489843 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:50.489752 2576 generic.go:358] "Generic (PLEG): container finished" podID="9676abf9-799c-4a3b-9467-0602c29d5367" 
containerID="965bc22df68e944dc0d2859c235657d6be02b379a8d8c1a096b6f8f2eaed9936" exitCode=0 Apr 17 21:02:50.490298 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:50.489835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" event={"ID":"9676abf9-799c-4a3b-9467-0602c29d5367","Type":"ContainerDied","Data":"965bc22df68e944dc0d2859c235657d6be02b379a8d8c1a096b6f8f2eaed9936"} Apr 17 21:02:51.494746 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:51.494692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" event={"ID":"9676abf9-799c-4a3b-9467-0602c29d5367","Type":"ContainerStarted","Data":"24943abf0e65a2a628d27b45d8129dd9e64ca02ec539965c7197de491d6ffa95"} Apr 17 21:02:51.495159 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:51.494921 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" Apr 17 21:02:51.513588 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:51.513527 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" podStartSLOduration=10.325016372 podStartE2EDuration="10.513509756s" podCreationTimestamp="2026-04-17 21:02:41 +0000 UTC" firstStartedPulling="2026-04-17 21:02:50.490592457 +0000 UTC m=+754.725058842" lastFinishedPulling="2026-04-17 21:02:50.679085558 +0000 UTC m=+754.913552226" observedRunningTime="2026-04-17 21:02:51.512834152 +0000 UTC m=+755.747300558" watchObservedRunningTime="2026-04-17 21:02:51.513509756 +0000 UTC m=+755.747976164" Apr 17 21:02:58.478402 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.478345 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx"] Apr 17 21:02:58.683942 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.683905 2576 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx"] Apr 17 21:02:58.684103 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.684038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.686527 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.686501 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 17 21:02:58.842056 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.841934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.842056 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.842013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.842292 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.842062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfpw8\" (UniqueName: \"kubernetes.io/projected/1aa2d81a-f998-4b39-a777-7cc9802aebae-kube-api-access-kfpw8\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: 
\"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.842292 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.842105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.842292 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.842145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa2d81a-f998-4b39-a777-7cc9802aebae-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.842292 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.842192 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.942890 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.942853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 
21:02:58.943091 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.942921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.943091 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.942951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.943091 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.942977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfpw8\" (UniqueName: \"kubernetes.io/projected/1aa2d81a-f998-4b39-a777-7cc9802aebae-kube-api-access-kfpw8\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.943091 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.943004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.943091 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.943032 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa2d81a-f998-4b39-a777-7cc9802aebae-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.943551 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.943524 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.943763 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.943736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.943857 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.943763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.945448 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.945428 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1aa2d81a-f998-4b39-a777-7cc9802aebae-dshm\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.945855 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.945834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa2d81a-f998-4b39-a777-7cc9802aebae-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.950689 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.950664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfpw8\" (UniqueName: \"kubernetes.io/projected/1aa2d81a-f998-4b39-a777-7cc9802aebae-kube-api-access-kfpw8\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx\" (UID: \"1aa2d81a-f998-4b39-a777-7cc9802aebae\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:58.994678 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:58.994640 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" Apr 17 21:02:59.117048 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:59.117022 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx"] Apr 17 21:02:59.119659 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:02:59.119623 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa2d81a_f998_4b39_a777_7cc9802aebae.slice/crio-1df90a7662b3372d55c1f725c5ff4cba7269c8e8fd6de85d1f89a12f35d79c45 WatchSource:0}: Error finding container 1df90a7662b3372d55c1f725c5ff4cba7269c8e8fd6de85d1f89a12f35d79c45: Status 404 returned error can't find the container with id 1df90a7662b3372d55c1f725c5ff4cba7269c8e8fd6de85d1f89a12f35d79c45 Apr 17 21:02:59.121539 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:59.121518 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:02:59.521550 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:59.521510 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" event={"ID":"1aa2d81a-f998-4b39-a777-7cc9802aebae","Type":"ContainerStarted","Data":"21532485957ce6369a9162fac196c7abff732e0701150026cc06bed922032cfe"} Apr 17 21:02:59.521550 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:02:59.521555 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" event={"ID":"1aa2d81a-f998-4b39-a777-7cc9802aebae","Type":"ContainerStarted","Data":"1df90a7662b3372d55c1f725c5ff4cba7269c8e8fd6de85d1f89a12f35d79c45"} Apr 17 21:03:02.510871 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:02.510841 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jmr95" 
Apr 17 21:03:04.539301 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:04.539268 2576 generic.go:358] "Generic (PLEG): container finished" podID="1aa2d81a-f998-4b39-a777-7cc9802aebae" containerID="21532485957ce6369a9162fac196c7abff732e0701150026cc06bed922032cfe" exitCode=0
Apr 17 21:03:04.539755 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:04.539310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" event={"ID":"1aa2d81a-f998-4b39-a777-7cc9802aebae","Type":"ContainerDied","Data":"21532485957ce6369a9162fac196c7abff732e0701150026cc06bed922032cfe"}
Apr 17 21:03:05.544252 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:05.544212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" event={"ID":"1aa2d81a-f998-4b39-a777-7cc9802aebae","Type":"ContainerStarted","Data":"f89e31cc4944c3b5f45f5ef43b1be332653b6d323a8c298bf434848f6080ab89"}
Apr 17 21:03:05.544662 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:05.544439 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx"
Apr 17 21:03:05.562225 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:05.562177 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx" podStartSLOduration=7.177671104 podStartE2EDuration="7.562164287s" podCreationTimestamp="2026-04-17 21:02:58 +0000 UTC" firstStartedPulling="2026-04-17 21:03:04.54001906 +0000 UTC m=+768.774485445" lastFinishedPulling="2026-04-17 21:03:04.924512229 +0000 UTC m=+769.158978628" observedRunningTime="2026-04-17 21:03:05.560962557 +0000 UTC m=+769.795428964" watchObservedRunningTime="2026-04-17 21:03:05.562164287 +0000 UTC m=+769.796630694"
Apr 17 21:03:16.560833 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:16.560745 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx"
Apr 17 21:03:54.152929 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.152892 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-58895d466f-j254z"]
Apr 17 21:03:54.155218 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.155197 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-58895d466f-j254z"
Apr 17 21:03:54.158415 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.158398 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-nx2nz\""
Apr 17 21:03:54.158482 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.158456 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 17 21:03:54.163267 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.163240 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-58895d466f-j254z"]
Apr 17 21:03:54.300544 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.300515 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/84df6f18-03c3-4a99-9950-cc82b710394e-tls-cert\") pod \"authorino-58895d466f-j254z\" (UID: \"84df6f18-03c3-4a99-9950-cc82b710394e\") " pod="kuadrant-system/authorino-58895d466f-j254z"
Apr 17 21:03:54.300733 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.300556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbl77\" (UniqueName: \"kubernetes.io/projected/84df6f18-03c3-4a99-9950-cc82b710394e-kube-api-access-wbl77\") pod \"authorino-58895d466f-j254z\" (UID: \"84df6f18-03c3-4a99-9950-cc82b710394e\") " pod="kuadrant-system/authorino-58895d466f-j254z"
Apr 17 21:03:54.401232 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.401201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/84df6f18-03c3-4a99-9950-cc82b710394e-tls-cert\") pod \"authorino-58895d466f-j254z\" (UID: \"84df6f18-03c3-4a99-9950-cc82b710394e\") " pod="kuadrant-system/authorino-58895d466f-j254z"
Apr 17 21:03:54.401449 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.401240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbl77\" (UniqueName: \"kubernetes.io/projected/84df6f18-03c3-4a99-9950-cc82b710394e-kube-api-access-wbl77\") pod \"authorino-58895d466f-j254z\" (UID: \"84df6f18-03c3-4a99-9950-cc82b710394e\") " pod="kuadrant-system/authorino-58895d466f-j254z"
Apr 17 21:03:54.404039 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.403965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/84df6f18-03c3-4a99-9950-cc82b710394e-tls-cert\") pod \"authorino-58895d466f-j254z\" (UID: \"84df6f18-03c3-4a99-9950-cc82b710394e\") " pod="kuadrant-system/authorino-58895d466f-j254z"
Apr 17 21:03:54.408437 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.408413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbl77\" (UniqueName: \"kubernetes.io/projected/84df6f18-03c3-4a99-9950-cc82b710394e-kube-api-access-wbl77\") pod \"authorino-58895d466f-j254z\" (UID: \"84df6f18-03c3-4a99-9950-cc82b710394e\") " pod="kuadrant-system/authorino-58895d466f-j254z"
Apr 17 21:03:54.465230 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.465195 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-58895d466f-j254z"
Apr 17 21:03:54.587452 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.587424 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-58895d466f-j254z"]
Apr 17 21:03:54.589962 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:03:54.589934 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84df6f18_03c3_4a99_9950_cc82b710394e.slice/crio-2a59b23a00f47031b5625abf72cd53f49fc0347670f1de470da5e275fb6e538c WatchSource:0}: Error finding container 2a59b23a00f47031b5625abf72cd53f49fc0347670f1de470da5e275fb6e538c: Status 404 returned error can't find the container with id 2a59b23a00f47031b5625abf72cd53f49fc0347670f1de470da5e275fb6e538c
Apr 17 21:03:54.703767 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.703729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-58895d466f-j254z" event={"ID":"84df6f18-03c3-4a99-9950-cc82b710394e","Type":"ContainerStarted","Data":"2a59b23a00f47031b5625abf72cd53f49fc0347670f1de470da5e275fb6e538c"}
Apr 17 21:03:55.708587 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:55.708550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-58895d466f-j254z" event={"ID":"84df6f18-03c3-4a99-9950-cc82b710394e","Type":"ContainerStarted","Data":"a727d357eed83a70e628cb610e4e56325893b7b457436cba6eea61cc833e3676"}
Apr 17 21:03:55.728619 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:55.728574 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-58895d466f-j254z" podStartSLOduration=1.2788376590000001 podStartE2EDuration="1.72855929s" podCreationTimestamp="2026-04-17 21:03:54 +0000 UTC" firstStartedPulling="2026-04-17 21:03:54.591210749 +0000 UTC m=+818.825677134" lastFinishedPulling="2026-04-17 21:03:55.040932379 +0000 UTC m=+819.275398765" observedRunningTime="2026-04-17 21:03:55.726550935 +0000 UTC m=+819.961017343" watchObservedRunningTime="2026-04-17 21:03:55.72855929 +0000 UTC m=+819.963025697"
Apr 17 21:05:16.220688 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:05:16.220610 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log"
Apr 17 21:05:16.221221 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:05:16.221097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log"
Apr 17 21:06:08.019098 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:08.019065 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-58895d466f-j254z_84df6f18-03c3-4a99-9950-cc82b710394e/authorino/0.log"
Apr 17 21:06:11.907490 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:11.907457 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-6d87747d66-j7w9t_50ba2daf-7a9f-4259-8de3-72e124c3f4e1/manager/0.log"
Apr 17 21:06:12.348411 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:12.348378 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj_37e19582-7399-462d-8bb9-575954b406de/manager/0.log"
Apr 17 21:06:13.622840 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:13.622808 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-58895d466f-j254z_84df6f18-03c3-4a99-9950-cc82b710394e/authorino/0.log"
Apr 17 21:06:13.730841 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:13.730811 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-vjkk6_cbb471c8-b495-4f9d-bb0c-73b1cedda753/manager/0.log"
Apr 17 21:06:13.834242 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:13.834207 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-tf489_68c11c08-7d9f-4a99-b37b-3653753db8d2/manager/0.log"
Apr 17 21:06:14.043732 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:14.043704 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-hqrnw_6f5eb69d-710e-4362-ba09-d3f4c87c5124/registry-server/0.log"
Apr 17 21:06:14.366616 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:14.366536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-hntcd_7a2d4136-2b7f-4bc2-b986-b2baf1642332/manager/0.log"
Apr 17 21:06:14.686610 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:14.686526 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs_1c2464fc-7069-43f6-b2cf-7e7d792a9f27/istio-proxy/0.log"
Apr 17 21:06:15.774750 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:15.774713 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-68qz7_0f91c53f-d373-40bc-b118-8725d5241312/storage-initializer/0.log"
Apr 17 21:06:15.780967 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:15.780951 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-68qz7_0f91c53f-d373-40bc-b118-8725d5241312/main/0.log"
Apr 17 21:06:15.884757 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:15.884729 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx_1aa2d81a-f998-4b39-a777-7cc9802aebae/storage-initializer/0.log"
Apr 17 21:06:15.891306 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:15.891288 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccwrmxx_1aa2d81a-f998-4b39-a777-7cc9802aebae/main/0.log"
Apr 17 21:06:16.109773 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:16.109699 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-jmr95_9676abf9-799c-4a3b-9467-0602c29d5367/storage-initializer/0.log"
Apr 17 21:06:16.116303 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:16.116270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-jmr95_9676abf9-799c-4a3b-9467-0602c29d5367/main/0.log"
Apr 17 21:06:22.605309 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:22.605275 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-c6jkl_7e49bff1-efa9-421a-8ee8-f431f0f0c109/global-pull-secret-syncer/0.log"
Apr 17 21:06:22.741124 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:22.741092 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gd4dw_10783c15-a601-4d72-90a5-870dc70d9889/konnectivity-agent/0.log"
Apr 17 21:06:22.783075 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:22.783041 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-66.ec2.internal_3b8513526e5cd2b3c1dca895a37fc635/haproxy/0.log"
Apr 17 21:06:26.588480 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:26.588450 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-58895d466f-j254z_84df6f18-03c3-4a99-9950-cc82b710394e/authorino/0.log"
Apr 17 21:06:26.617711 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:26.617685 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-vjkk6_cbb471c8-b495-4f9d-bb0c-73b1cedda753/manager/0.log"
Apr 17 21:06:26.645133 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:26.645104 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-tf489_68c11c08-7d9f-4a99-b37b-3653753db8d2/manager/0.log"
Apr 17 21:06:26.703607 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:26.703572 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-hqrnw_6f5eb69d-710e-4362-ba09-d3f4c87c5124/registry-server/0.log"
Apr 17 21:06:26.798172 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:26.798141 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-hntcd_7a2d4136-2b7f-4bc2-b986-b2baf1642332/manager/0.log"
Apr 17 21:06:28.462177 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:28.462150 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8wvj_b04ae14e-8db1-4d80-bdfa-810545a2b5ef/node-exporter/0.log"
Apr 17 21:06:28.488928 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:28.488906 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8wvj_b04ae14e-8db1-4d80-bdfa-810545a2b5ef/kube-rbac-proxy/0.log"
Apr 17 21:06:28.513044 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:28.513013 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8wvj_b04ae14e-8db1-4d80-bdfa-810545a2b5ef/init-textfile/0.log"
Apr 17 21:06:30.376918 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:30.376888 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-cs29g_c0f5afe7-3f63-49d2-8f14-97de5a47e278/networking-console-plugin/0.log"
Apr 17 21:06:31.416203 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.416163 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"]
Apr 17 21:06:31.418595 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.418572 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.420830 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.420807 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mxv2s\"/\"kube-root-ca.crt\""
Apr 17 21:06:31.421712 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.421694 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mxv2s\"/\"default-dockercfg-kt5z7\""
Apr 17 21:06:31.421827 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.421741 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mxv2s\"/\"openshift-service-ca.crt\""
Apr 17 21:06:31.425844 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.425822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"]
Apr 17 21:06:31.526374 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.526327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-podres\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.526559 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.526409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-lib-modules\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.526559 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.526446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-sys\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.526559 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.526469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-proc\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.526664 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.526558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpx8l\" (UniqueName: \"kubernetes.io/projected/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-kube-api-access-fpx8l\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.627607 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.627562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-podres\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.627607 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.627608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-lib-modules\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.627838 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.627647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-sys\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.627838 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.627667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-proc\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.627838 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.627709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpx8l\" (UniqueName: \"kubernetes.io/projected/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-kube-api-access-fpx8l\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.627838 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.627747 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-podres\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.627838 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.627784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-sys\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.627838 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.627788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-proc\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.627838 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.627815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-lib-modules\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.635748 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.635722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpx8l\" (UniqueName: \"kubernetes.io/projected/ece9c2fc-47f6-4e96-8cb8-94a1fd130592-kube-api-access-fpx8l\") pod \"perf-node-gather-daemonset-dl7v5\" (UID: \"ece9c2fc-47f6-4e96-8cb8-94a1fd130592\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.729033 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.728995 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:31.855211 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:31.855120 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"]
Apr 17 21:06:31.858248 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:06:31.858223 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podece9c2fc_47f6_4e96_8cb8_94a1fd130592.slice/crio-d13474dca7b963eb054e1dc4c3636e8b650f1e456e032beb76f95e1214fa9cc1 WatchSource:0}: Error finding container d13474dca7b963eb054e1dc4c3636e8b650f1e456e032beb76f95e1214fa9cc1: Status 404 returned error can't find the container with id d13474dca7b963eb054e1dc4c3636e8b650f1e456e032beb76f95e1214fa9cc1
Apr 17 21:06:32.233293 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:32.233259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5" event={"ID":"ece9c2fc-47f6-4e96-8cb8-94a1fd130592","Type":"ContainerStarted","Data":"a232035f4b0a8dbbc3d6c5ae65285350474bdc94ef7a0db606e82341d1ab4bc8"}
Apr 17 21:06:32.233293 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:32.233297 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5" event={"ID":"ece9c2fc-47f6-4e96-8cb8-94a1fd130592","Type":"ContainerStarted","Data":"d13474dca7b963eb054e1dc4c3636e8b650f1e456e032beb76f95e1214fa9cc1"}
Apr 17 21:06:32.233562 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:32.233390 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:32.247048 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:32.246997 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5" podStartSLOduration=1.2469821 podStartE2EDuration="1.2469821s" podCreationTimestamp="2026-04-17 21:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:06:32.246588795 +0000 UTC m=+976.481055203" watchObservedRunningTime="2026-04-17 21:06:32.2469821 +0000 UTC m=+976.481448506"
Apr 17 21:06:32.756289 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:32.756260 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kx6mg_dea8814d-46ce-4135-a1ce-f5b8ff97088a/dns/0.log"
Apr 17 21:06:32.776150 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:32.776128 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kx6mg_dea8814d-46ce-4135-a1ce-f5b8ff97088a/kube-rbac-proxy/0.log"
Apr 17 21:06:32.944025 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:32.943984 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qcw6q_f96c4ba0-6cee-4727-bef7-248a0da4b215/dns-node-resolver/0.log"
Apr 17 21:06:33.451391 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:33.451346 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-f2pm8_bf8a6449-19c6-469b-9f9c-049cc3f220b8/node-ca/0.log"
Apr 17 21:06:34.274600 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:34.274574 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf7dphs_1c2464fc-7069-43f6-b2cf-7e7d792a9f27/istio-proxy/0.log"
Apr 17 21:06:34.898126 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:34.898100 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dpkwf_3bd339db-87cf-44af-8c74-4c5f57f80ccc/serve-healthcheck-canary/0.log"
Apr 17 21:06:35.515319 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:35.515277 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dpp5j_3f030b7e-da00-454b-9b12-10b15cc9e274/kube-rbac-proxy/0.log"
Apr 17 21:06:35.538098 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:35.538071 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dpp5j_3f030b7e-da00-454b-9b12-10b15cc9e274/exporter/0.log"
Apr 17 21:06:35.559247 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:35.559223 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dpp5j_3f030b7e-da00-454b-9b12-10b15cc9e274/extractor/0.log"
Apr 17 21:06:37.385988 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:37.385957 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-6d87747d66-j7w9t_50ba2daf-7a9f-4259-8de3-72e124c3f4e1/manager/0.log"
Apr 17 21:06:37.497211 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:37.497184 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f74b9c8f9-qd4jj_37e19582-7399-462d-8bb9-575954b406de/manager/0.log"
Apr 17 21:06:38.247396 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:38.247339 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-dl7v5"
Apr 17 21:06:38.559876 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:38.559798 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7bd8bcccff-fqj79_5ce3fe5b-22e6-491f-9eed-cb71671ee9c0/manager/0.log"
Apr 17 21:06:44.426986 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:44.426952 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rv6lj_1876416b-79dd-4ff1-88a4-b7111c5e304d/kube-multus-additional-cni-plugins/0.log"
Apr 17 21:06:44.447239 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:44.447210 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rv6lj_1876416b-79dd-4ff1-88a4-b7111c5e304d/egress-router-binary-copy/0.log"
Apr 17 21:06:44.467437 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:44.467407 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rv6lj_1876416b-79dd-4ff1-88a4-b7111c5e304d/cni-plugins/0.log"
Apr 17 21:06:44.487572 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:44.487544 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rv6lj_1876416b-79dd-4ff1-88a4-b7111c5e304d/bond-cni-plugin/0.log"
Apr 17 21:06:44.507282 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:44.507257 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rv6lj_1876416b-79dd-4ff1-88a4-b7111c5e304d/routeoverride-cni/0.log"
Apr 17 21:06:44.527290 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:44.527267 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rv6lj_1876416b-79dd-4ff1-88a4-b7111c5e304d/whereabouts-cni-bincopy/0.log"
Apr 17 21:06:44.573293 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:44.573263 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rv6lj_1876416b-79dd-4ff1-88a4-b7111c5e304d/whereabouts-cni/0.log"
Apr 17 21:06:44.747942 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:44.747918 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cj26q_fc1a4043-1274-42fb-ade0-e46458e332ce/kube-multus/0.log"
Apr 17 21:06:44.803652 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:44.803605 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cxq8r_b1da9568-78d7-4d7f-93b4-33b608a48c41/network-metrics-daemon/0.log"
Apr 17 21:06:44.822812 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:44.822778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cxq8r_b1da9568-78d7-4d7f-93b4-33b608a48c41/kube-rbac-proxy/0.log"
Apr 17 21:06:45.697228 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:45.697195 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-controller/0.log"
Apr 17 21:06:45.714543 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:45.714516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/0.log"
Apr 17 21:06:45.719009 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:45.718989 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovn-acl-logging/1.log"
Apr 17 21:06:45.738461 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:45.738429 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/kube-rbac-proxy-node/0.log"
Apr 17 21:06:45.758027 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:45.757990 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 21:06:45.776279 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:45.776246 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/northd/0.log"
Apr 17 21:06:45.796245 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:45.796220 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/nbdb/0.log"
Apr 17 21:06:45.821563 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:45.821537 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/sbdb/0.log"
Apr 17 21:06:45.917813 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:45.917781 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vzmr_82f6c12a-75ed-42b7-8c6c-bc314957ec1f/ovnkube-controller/0.log"
Apr 17 21:06:47.623077 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:06:47.623045 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rjt2l_9e571f60-0b76-435b-aac4-aada6990b2b3/network-check-target-container/0.log"
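The stream above is almost entirely info-level klog records (leader token starting with `I`), with the occasional warning (`W`), such as the two cadvisor "Failed to process watch event" entries. When triaging an excerpt like this offline, a quick first pass is to filter on the klog severity character. A minimal sketch; the two sample lines below are abbreviated stand-ins for records from this journal, not the full log:

```shell
#!/bin/sh
# Write an abbreviated two-line stand-in for a saved kubelet journal excerpt.
cat > /tmp/kubelet-excerpt.log <<'EOF'
Apr 17 21:03:54.589962 ip-10-0-130-66 kubenswrapper[2576]: W0417 21:03:54.589934 2576 manager.go:1169] Failed to process watch event
Apr 17 21:03:54.703767 ip-10-0-130-66 kubenswrapper[2576]: I0417 21:03:54.703729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
EOF

# klog encodes severity as the first character of the leader token
# (I=info, W=warning, E=error, F=fatal) followed by MMDD; match warnings only.
grep -E 'kubenswrapper\[[0-9]+\]: W[0-9]{4}' /tmp/kubelet-excerpt.log
```

On a live node the same excerpt can be pulled straight from the journal with `journalctl -u kubelet.service` and piped through the same filter; the grep works on saved excerpts because klog's severity prefix survives the journald line format intact.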